Augmented Reality Face Interface using ARKit

This uses the face-tracking functionality built into ARKit (face *tracking*, not facial *recognition*). I believe it is the same underlying technology behind Animoji in iOS.

ARKit's face tracking can track up to three different faces in a scene at once.
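A minimal sketch of how that multi-face session could be configured (assuming an `ARSCNView` named `sceneView`; on iOS 13+ the framework exposes the supported maximum, which is 3 on TrueDepth devices):

```swift
import ARKit

func startFaceTracking(on sceneView: ARSCNView) {
    // Face tracking requires a TrueDepth camera.
    guard ARFaceTrackingConfiguration.isSupported else { return }

    let configuration = ARFaceTrackingConfiguration()
    // Ask for as many simultaneously tracked faces as the device supports.
    configuration.maximumNumberOfTrackedFaces =
        ARFaceTrackingConfiguration.supportedNumberOfTrackedFaces
    sceneView.session.run(configuration)
}
```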

Here, when ARKit detects my face, I simply anchor about ten static images relative to the detected face; when my face moves, ARKit moves the attached images to match.
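The anchoring step could be sketched like this (the view-controller type, asset name, and offsets are assumptions for illustration; ARKit moves the anchor's node itself, so child nodes follow the face automatically):

```swift
import ARKit
import SceneKit
import UIKit

extension ViewController: ARSCNViewDelegate {
    // Called once when ARKit first detects a face and adds its anchor node.
    func renderer(_ renderer: SCNSceneRenderer,
                  didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARFaceAnchor else { return }

        // One flat plane per image, positioned in the face's local
        // coordinate space (units are meters).
        let plane = SCNPlane(width: 0.05, height: 0.05)
        plane.firstMaterial?.diffuse.contents = UIImage(named: "info_card") // hypothetical asset
        let imageNode = SCNNode(geometry: plane)
        imageNode.position = SCNVector3(0.1, 0.05, 0) // to the right of the face
        node.addChildNode(imageNode)
        // Repeat for the other images, each with its own offset.
    }
}
```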

I'm also using facial-expression detection (again, built into ARKit): when it detects me sticking my tongue out, it fades the images' opacity in or out, and when I wink, it moves the position of some of the images.
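Expression detection comes from the face anchor's blend shapes, each a 0–1 coefficient. A sketch of the two reactions described above (`tongueOut` and `eyeBlinkLeft` are real blend-shape keys; the node name, threshold, and movement amount are assumptions):

```swift
import ARKit
import SceneKit

// Called every frame while the face is tracked.
func renderer(_ renderer: SCNSceneRenderer,
              didUpdate node: SCNNode, for anchor: ARAnchor) {
    guard let faceAnchor = anchor as? ARFaceAnchor else { return }

    let tongue = faceAnchor.blendShapes[.tongueOut]?.floatValue ?? 0
    let blinkLeft = faceAnchor.blendShapes[.eyeBlinkLeft]?.floatValue ?? 0

    // Sticking the tongue out fades the attached images in;
    // relaxing it fades them back out.
    node.childNodes.forEach { $0.opacity = CGFloat(tongue) }

    // A wink (left eye closed well past a threshold) nudges an image aside.
    if blinkLeft > 0.8 {
        node.childNode(withName: "movableImage", recursively: true)?
            .position.x += 0.001
    }
}
```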

Pretty simple stuff. Eventually I'd like to make the displayed information dynamic and real-time.