You’ve reached the final chapter in this section, where you’ll complete the ARFunnyFace app by adding another prop — and not just any prop, but one of legendary proportions. Prepare to come face-to-face with the mighty Green Robot Head!
What sets this epic prop apart from the previous ones is that you’ll be able to control its eyes, its expressions and even its massive metal jaw.
Like a true puppeteer, you’ll be able to animate this robot head using your own facial movements and expressions, thanks to facial blend shapes. Awesome!
What Are Facial Blend Shapes?
ARFaceAnchor tracks many key facial features, including blinking eyes, an opening mouth and moving eyebrows. These key tracking points are known as blend shapes.
You can easily use these blend shapes to animate 2D or 3D characters to mimic the user’s facial expressions.
Here’s an example of a 2D smiley character that animates by tracking the user’s eyes and mouth.
Each key tracking feature is represented by a floating point number that indicates the current position of the corresponding facial feature.
These blend shape values range from 0.0, indicating the neutral position, to 1.0, indicating the maximum position. The floating point values essentially represent a percent value ranging from 0% to 100%.
As the user blinks both eyes, the blend shape values start at 100% open, then gradually reduce to 0% open.
The mouth works the same way, starting at 100% open then reducing to 0% open.
You use the percentage values to animate the corresponding facial features from a 100% open position to a 0% open position — aka, closed.
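Here's a quick sketch, illustrative rather than from the project, of how you might read one of these blend shape values with ARKit and express it as a percentage; the function name eyeOpenPercent is hypothetical:

import ARKit

func eyeOpenPercent(for faceAnchor: ARFaceAnchor) -> Float {
  // eyeBlinkLeft reads 0.0 when the eye is fully open, 1.0 when fully closed
  let blink = faceAnchor.blendShapes[.eyeBlinkLeft]?.floatValue ?? 0.0
  // Invert the value to express it as a percent-open amount
  return (1.0 - blink) * 100.0
}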
You can even prerecord the blend shape data, which you can play back at a later time to animate your game characters, for example. Sweet!
Building the Robot
Next, it’s time to build the Mighty Green Robot head. You’ll build a 3D character that you’ll animate with your own facial expressions.
Open starter/ARFunnyFace/ARFunnyFace.xcodeproj in Xcode, then select Experience.rcproject and open it in Reality Composer.
Open the Scenes panel and create a new scene that uses a Face Anchor. With the Properties panel open, rename the scene to Robot.
Now, you'll add a Basic ▸ Capsule to the Robot scene.
Under the Transform section, adjust the capsule's Position and Rotation so it sits centered over the user's head, and leave Scale at 100%.
Finally, go to the Look section, choose Matte Paint for the Material and set the Material Color to Green. Then set the Capsule Diameter and the Height so the capsule roughly matches the size of a head.
Take a quick look at the scene; the user's face should now be fully obscured.
Next, you'll create the rest of the robot head, which consists of four basic parts: a RobotEye, a RobotEyelid, a RobotJaw and a RobotSkull.
case 3: // Robot
  // 1
  let arAnchor = try! Experience.loadRobot()
  // 2
  uiView.scene.anchors.append(arAnchor)
  // 3
  robot = arAnchor
  break
Here's how it breaks down:
This loads the Robot scene from Experience.rcproject and stores it in arAnchor.
It then appends arAnchor to the scene anchors.
Finally, it stores arAnchor in robot so other parts of the code can use it to get notifications from the robot while it's active. It also provides quick access to all the elements of the robot head.
Now would be a great time to do a quick check to make sure everything still works as intended. Do a quick build and run.
Excellent! Select the new prop and, suddenly, the Mighty Green Robot head springs into view.
To animate the robot’s eyelids and jaw, you need to update their positions and rotations as ARFaceAnchor tracks the user’s facial expressions in real time. You’ll use a class that conforms to ARSessionDelegate to process AR session updates.
From a SwiftUI perspective, you may need to create a custom instance to communicate changes from the view controller to the other parts of the SwiftUI interface. You'll use a makeCoordinator to create that custom instance.
To do so, add the following function to ARViewContainer:
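Here's a minimal sketch of that function, along with the ARDelegateHandler class it returns; treat the exact shape of the class as an assumption for now:

class ARDelegateHandler: NSObject, ARSessionDelegate {
  var arViewContainer: ARViewContainer

  init(_ control: ARViewContainer) {
    arViewContainer = control
    super.init()
  }
}

func makeCoordinator() -> ARDelegateHandler {
  ARDelegateHandler(self)
}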
This defines makeCoordinator and indicates that it will provide an instance of ARDelegateHandler. It then creates an actual instance of ARDelegateHandler, passing in self, the ARViewContainer.
Now that everything's in place, you can set the session delegate for the view. Add the following line of code to makeUIView(context:), just after initializing arView:
arView.session.delegate = context.coordinator
Here, you set the view's session delegate to the current coordinator, which will now be informed whenever the session detects any changes.
Handling ARSession Updates
With the delegate class in place, you can now start tracking updates to any of the facial blend shapes.
Add the following function to ARDelegateHandler:
// 1
func session(_ session: ARSession,
  didUpdate anchors: [ARAnchor]) {
  // 2
  guard robot != nil else { return }
  // 3
  var faceAnchor: ARFaceAnchor?
  for anchor in anchors {
    if let a = anchor as? ARFaceAnchor {
      faceAnchor = a
    }
  }
}
Here's what's happening above:
This defines session(_:didUpdate:), which triggers every time there's an update available for an anchor.
You're only interested in anchor updates while the robot scene is active. When robot is nil, you simply skip all updates.
This scans the provided anchors for one that conforms to ARFaceAnchor, then stores it in faceAnchor. You'll inspect all the updated blend shape information from here.
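As a side note, if you prefer a more compact functional style, the same face anchor lookup can be written in a single line. With a single tracked face, the result is the same:

let faceAnchor = anchors.compactMap { $0 as? ARFaceAnchor }.first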
Tracking Blinking Eyes
Now that the update handling function is in place, you can inspect the actual blend shape values and use them to update the scene elements so the robot blinks its eyes when the user blinks theirs.
You'll use the eyeBlinkLeft and eyeBlinkRight blend shapes to track the user's eyes.
Start by adding the following block of code to the bottom of session(_:didUpdate:):
let blendShapes = faceAnchor?.blendShapes
let eyeBlinkLeft = blendShapes?[.eyeBlinkLeft]?.floatValue
let eyeBlinkRight = blendShapes?[.eyeBlinkRight]?.floatValue
Here, you access the blendShapes through the updated faceAnchor. You then extract the specific blend shape for eyeBlinkLeft to get its current value, which is provided as a floatValue.
Then you use the same approach to get the current value for eyeBlinkRight.
Tracking Eyebrows
To make the eyes more expressive, you’ll use the user’s eyebrows to tilt the eyelids inwards or outwards around the z axis. This makes the robot look angry or sad, depending on the user’s expression.
To put this into place, add the following to the bottom of session(_:didUpdate:):
let browInnerUp = blendShapes?[.browInnerUp]?.floatValue
let browLeft = blendShapes?[.browDownLeft]?.floatValue
let browRight = blendShapes?[.browDownRight]?.floatValue
Great, now you're tracking the eyebrows. The only thing left to do is to adjust the orientation of the eyelids with those blend shape values. To do that, though, you'll also need to know what the user is doing with their jaw.
Tracking the Jaw
Now, you'll track the user's jaw and use it to update the robot's jaw orientation. You'll use the jawOpen blend shape to track the user's jaw movement.
Add the following line of code to the bottom of session(_:didUpdate:):
let jawOpen = blendShapes?[.jawOpen]?.floatValue
Now, you're going to use some special math to orient both the eyelids and the jaw.
Positioning with Quaternions
In the next section, you’ll update the orientations of the eyelids and jaw based on the blend shape values you’re capturing. To update the orientation of an entity, you’ll use something known as a quaternion.
A quaternion is a four-element vector used to encode any possible rotation in a 3D coordinate system. A quaternion represents two components: a rotation axis and the amount of rotation around that axis.
Three vector components, x, y and z, represent the axis, while a w component represents the rotation amount.
Quaternions are difficult to use. Luckily, there are a few handy functions that make working with them a breeze.
Here are the important quaternion functions you'll use in this chapter:
simd_mul(p:q:): Lets you multiply quaternions together to form a single quaternion. Use this function when you want to apply more than one rotation to an object.
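For instance, here's a small illustration, not from the project, that builds two rotations with simd_quatf(angle:axis:), the standard initializer that creates a quaternion from an angle and an axis, and then combines them with simd_mul:

import simd

let pitch = simd_quatf(angle: .pi / 4, axis: [1, 0, 0]) // 45° around x
let tilt = simd_quatf(angle: .pi / 8, axis: [0, 0, 1]) // 22.5° around z
let combined = simd_mul(pitch, tilt) // both rotations in one quaternion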
You have to specify angles in radians. To make life a little easier, you'll use a simple helper function that converts degrees into radians.
Add a small helper function to ARDelegateHandler for this conversion.
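A minimal sketch, assuming the helper is named Deg2Rad (match the name to whatever you call yours):

func Deg2Rad(_ value: Float) -> Float {
  value * .pi / 180
}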
At the same time, you'll apply a rotation around the z-axis to make the eye appear angry or sad. You'll use the same approach with the captured brow blend shapes.
Here's what all that looks like in code; you'll add one last block to the bottom of session(_:didUpdate:).
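The sketch below shows the idea. The entity names (eyeLidL, eyeLidR and jaw) and the base angles and ranges of motion are illustrative assumptions, so adapt them to the names and proportions in your own Robot scene:

guard let eyeBlinkLeft = eyeBlinkLeft,
      let eyeBlinkRight = eyeBlinkRight,
      let browInnerUp = browInnerUp,
      let browLeft = browLeft,
      let browRight = browRight,
      let jawOpen = jawOpen
else { return }

// Pitch each eyelid open or closed around x, then tilt it
// angry or sad around z based on the brow values
robot!.eyeLidL?.orientation = simd_mul(
  simd_quatf(angle: Deg2Rad(-120 + (90 * eyeBlinkLeft)),
             axis: [1, 0, 0]),
  simd_quatf(angle: Deg2Rad((90 * browLeft) - (30 * browInnerUp)),
             axis: [0, 0, 1]))
robot!.eyeLidR?.orientation = simd_mul(
  simd_quatf(angle: Deg2Rad(-120 + (90 * eyeBlinkRight)),
             axis: [1, 0, 0]),
  simd_quatf(angle: Deg2Rad((-90 * browRight) + (30 * browInnerUp)),
             axis: [0, 0, 1]))

// Rest the jaw at a negative base angle, opening it as jawOpen grows
robot!.jaw?.orientation = simd_quatf(
  angle: Deg2Rad(-100 + (30 * jawOpen)),
  axis: [1, 0, 0])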
Similar to how the eyelids work, the jaw rests at a negative base angle, with its range of motion mapped to the jawOpen blend shape.
And that's it, you're all done! Time for another build and run test.
You can now blink, frown and control that huge metal jaw. However, that robot looks a bit on the angry side! :]
Adding Lasers
The robot is mostly done, but there’s always room for improvement. Wouldn’t it be cool if it could shoot lasers from its eyes when it gets extra angry?
Your next goal is to really bring those lasers to life. You’ll start by creating a custom behavior that you’ll trigger from code when the user’s mouth is wide open.
Ysudu xwa qaruxk uri kowukj, gui calo se daij naj nzoq gu tuguhx yomoji koo toz kucu ekahfix subud. Yu ehlaeni jper, joi’dg nepc e dorovojovuur te toeh bubu to ipsilahi cros fro jumoq jus kiraldar.
The first thing you need to do is to hide the lasers when the scene starts. Open the Behaviors panel, then add a Start Hidden behavior.
Rename the behavior, then add the two lasers as the affected objects for the Hide action.
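When you later wire the lasers up in code, the generated Experience class exposes your behaviors as notification triggers and notify actions. Here's a rough sketch of that round trip, assuming a trigger behavior named ShowLasers and a Notify action named LasersDone (both names are placeholders):

// Fire the lasers from code by posting the scene's trigger
robot!.notifications.showLasers.post()
// The Notify action reports back when the laser sequence ends
robot!.actions.lasersDone.onAction = { _ in
  // Safe to fire the lasers again
}

You could call post() from session(_:didUpdate:) whenever jawOpen crosses a threshold, say 0.9, so the lasers fire when the user's mouth is wide open.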
Congratulations! Build and run to test the final project.
The robot can blink, look sad and angry, and open and close its jaw. But, best of all, it can shoot lasers from its eyes when the user opens their mouth wide. Awesome!
Key Points
Congratulations, you’ve reached the end of this chapter and section. Before grabbing a delicious cup of coffee, quickly take a look at some key points you’ve learned in this chapter.
To recap:
Facial blend shapes: You've learned about facial blend shapes and how they're used to track a face's key points.