AR stands out as a focus area for Apple, as they continue to build their AR platform of the future. Thanks to AR Quick Look, AR has become extremely accessible and is now deeply integrated into iOS, macOS and tvOS.
Creating immersive AR experiences has historically been difficult, requiring a vast amount of skill and knowledge. To deliver top-rate AR experiences, developers have had to master rendering technologies, physics simulation, animation, interactivity and much more.
Thankfully, that all changed with the introduction of RealityKit.
With RealityKit in your toolbox, creating AR experiences has never been easier.
In this section, you’ll learn all about RealityKit and face tracking. You’ll create a Snapchat-like face filter app with SwiftUI called AR Funny Face, where you get to mock up your face with funny props. You’ll also create an animated mask that you can control with your eyes, brows and mouth.
What is RealityKit?
RealityKit is a new Swift framework that Apple introduced at WWDC 2019. Apple designed it from the ground up with AR development in mind. Its main purpose is to help you build AR apps and experiences more easily. Thanks to the awesome power of Swift, RealityKit delivers a high-quality framework with a super simple API.
RealityKit is a high-quality rendering technology capable of delivering hyper-realistic, physically-based graphics with precise physics simulation and collisions against the real-world environment. It does all of the heavy lifting for you, right out of the box. It makes your content look as good as possible while fitting seamlessly into the real world. Its impressive feature list includes skeletal animations, realistic shadows, lights, reflections and post-processing effects.
Are you ready to give it a try? Open RealityKit and take a look at what’s inside.
At its core, you’ll find many of Apple’s other frameworks, but the ones doing most of the work are ARKit and Metal.
Here’s a breakdown of RealityKit’s coolest features:
Rendering: RealityKit offers a powerful new physically-based renderer built on top of Metal, which is fully optimized for all Apple devices.
Animation: It has built-in support for skeletal animation and transform-based animation. So, if you want, you can animate a zombie or you can move, scale and rotate objects with various easing functions.
Physics: With a powerful physics engine, RealityKit lets you throw anything at it — pun intended! You can adjust real-world physics properties like mass, drag and restitution, allowing you to fine-tune collisions.
Audio: Spatial audio understanding and automatic listener configuration let you attach sound effects to 3D objects. You can then track those sounds, making them sound realistic based on their position in the real world.
ECS: From a coding perspective, RealityKit enforces the Entity Component System design pattern to build objects within the world.
Synchronization: The framework has built-in support for networking, designed for collaborative experiences. It even offers automatic synchronization of entities between multiple clients.
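To make the ECS idea concrete, here’s a small, hypothetical sketch. `SpinComponent` is a type of our own invention (the name and property are made up), conforming to RealityKit’s `Component` protocol; attaching components like this is how RealityKit entities gain data and behavior:

```swift
import RealityKit

// Hypothetical custom component: holds a spin speed for an entity.
// Component is RealityKit's protocol; SpinComponent is our own type.
struct SpinComponent: Component {
    var revolutionsPerSecond: Float = 0.5
}

// An entity is a container; data and behavior come from its components.
let box = ModelEntity(mesh: .generateBox(size: 0.1))
box.components[SpinComponent.self] = SpinComponent(revolutionsPerSecond: 1.0)

// Your update code (or a system) reads the component back when needed.
if let spin = box.components[SpinComponent.self] {
    print("Spin speed:", spin.revolutionsPerSecond)
}
```

The point of the pattern is composition over inheritance: rather than subclassing a “spinning box,” you attach a spin component to any entity that needs it.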
Enough talk, it’s time to dive into some code!
Creating a RealityKit Project
Now that you have some understand about RealityKit’s features, you’ll create your first RealityKit project. Launch Xcode and get ready to create a new Augmented Reality App project from scratch.
Note: If you’d rather skip creating the project from scratch and use the starter project instead — which also includes the app icons — you can find it in this chapter’s starter/ARFunnyFace.xcodeproj. Feel free to skip to the next section.
LaunchScreen.storyboard: Here, you’ll find the UI the user sees while your app is launching.
Info.plist: Contains the app’s basic configuration settings. Note that there’s already a Camera Usage Description property; you will need to change it to something more appropriate for your app. This allows the app to request access to the camera from the user, which you need to deliver the AR experience through the camera feed.
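For reference, the camera permission entry in the raw Info.plist XML uses the `NSCameraUsageDescription` key (Xcode displays it as Privacy — Camera Usage Description); the description string below is only an example — write your own:

```xml
<key>NSCameraUsageDescription</key>
<string>AR Funny Face needs camera access to place funny props on your face.</string>
```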
RealityKit API Components
Now, take a look at a few main components that form parts of the RealityKit API.
Sexa’k aw ebawnda el e sxwavuv bwjuwviko monvoamigb epn ol dca iwhektuzv ajasahyk:
ARView: The ARView sits at the core of any RealityKit experience, taking responsibility for all of the heavy lifting. It comes with full gesture support, allowing you to attach gestures to entities. It also handles the post-processing camera effects, which is very similar to the effects you get in AR Quick Look.
Scene: Think of this as the container for all of your entities.
Anchors: RealityKit exposes ARKit’s available anchors — plane, face, body, image and object — as first-class citizens. Anchors form the basic root of entity hierarchies. Note that content attached to an anchor only gets placed after you successfully identify it and connect it to the real world.
Entities: You can picture each element of the virtual content in a scene as an entity — the basic building block of your experience. You can construct a tree-like hierarchical structure by parenting entities to other entities.
Components: Entities consist of different types of components. These components give the entities specific features and functionality, like how they look, how they respond to collisions and how they react to physics.
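Here’s how those pieces might fit together in code — a minimal, hypothetical sketch that parents an entity to a face anchor inside an ARView’s scene (the generated sphere stands in for a real prop model):

```swift
import RealityKit

// The ARView owns the Scene that holds all of your anchors and entities.
let arView = ARView(frame: .zero)

// An anchor entity that attaches to the first face ARKit detects.
let faceAnchor = AnchorEntity(.face)

// A placeholder entity; a real app would load a prop model instead.
let sphere = ModelEntity(mesh: .generateSphere(radius: 0.05))

// Build the hierarchy: entity -> anchor -> scene.
faceAnchor.addChild(sphere)
arView.scene.addAnchor(faceAnchor)
```

Until ARKit actually detects a face in the camera feed, the sphere exists in the scene graph but isn’t rendered anywhere.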
Building the UI with SwiftUI
When you created the app, you selected SwiftUI for the user interface. Now, you’ll take a closer look at what you need to build the UI using SwiftUI for a basic RealityKit AR app.
The UI is very simple, and it requires only three basic buttons: Next, Previous and Shutter.
Zuo’cy obe rwu Litl emz Vmidiear bavfefj ye lpargx hirboaj faraaol EH kpecaj, vdana rxe Tqummip gunniq gakn dine pza act-oncexmelc pitsiu. Dan giek ritsx iytoh ev jobiwuvb ix ne peojw siq pi kmary qpe ilsega vdow tk ibqkazoknivc xlo Kofb adx Wtijoout tivwews.
Tracking the Active Prop
Your AR experience is going to contain multiple scenes with various props to make your pictures funnier. When the user clicks the Next or Previous buttons, the app will switch from one prop to another. You’ll implement that functionality now.
Open ContentView.swift and define a variable to keep track of the active prop by adding the following line of code at the top of ContentView:
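The book’s own listing isn’t reproduced here, but given the $propId binding it passes to ARViewContainer(), the declaration is presumably a single @State integer property — something along these lines:

```swift
@State var propId: Int = 0
```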
To overlay the UI buttons on the AR view, you place the elements into a ZStack.
You provide the $propId as a parameter for ARViewContainer(), monitoring the content closely. So, when the value of propId changes, it invalidates the ARView.
Finally, you stack the buttons horizontally within an HStack.
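Putting those three points together, the view ends up with roughly the shape below. This is a hedged sketch, not the book’s exact listing: ARViewContainer is the UIViewRepresentable wrapper from the Xcode AR template, assumed here to accept a propId binding, and the button labels, actions and styling are simplified placeholders:

```swift
import SwiftUI

// Rough sketch of the layout: AR view at the back, buttons on top.
struct ContentView: View {
    @State var propId: Int = 0

    var body: some View {
        ZStack(alignment: .bottom) {
            // The AR view fills the screen; the binding lets SwiftUI
            // rebuild the container whenever propId changes.
            ARViewContainer(propId: $propId)
                .edgesIgnoringSafeArea(.all)

            // The three controls sit in a horizontal row.
            HStack {
                Button("Previous") {
                    propId = propId > 0 ? propId - 1 : 0
                }
                Button("Shutter") {
                    // Take the selfie here.
                }
                Button("Next") {
                    propId += 1
                }
            }
        }
    }
}
```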
Great! You’ve now created a variable to keep track of the active prop. You’ll update that variable when the user presses the Next and Previous buttons to flip between the various props within the Reality Composer experience.
The buttons all use images. So next, you’ll add the required images to the project by dragging and dropping all the image files from starter/resources/images into Assets.xcassets.
Under the Attributes panel, be sure to set Image Set ▸ Render As to Original Image. Otherwise, the images will display with a blue highlight.