Adapting characters to use real-time facial capture | Unreal Engine

The Live Link Face app streams high-quality facial animation in real time from your iPhone directly onto characters in Unreal Engine. Sometimes, though, the characters we want to transfer these animations onto don't fully follow the basic setup recommendations; with a little ingenuity, we can work around that.

In this video we'll use the Meerkat demo project, developed by Weta Digital, to animate this adorable critter.
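Under the hood, the streamed blendshapes arrive as named float curves on a Live Link subject. Here is a minimal C++ sketch of the idea — the subject name "iPhone" and the free-standing ApplyFaceCurves function are illustrative assumptions, while the Live Link client calls themselves are the engine's API:

#include "ILiveLinkClient.h"
#include "LiveLinkTypes.h"
#include "Features/IModularFeatures.h"
#include "Roles/LiveLinkBasicRole.h"
#include "Components/SkeletalMeshComponent.h"

// Reads the latest frame of named curves from a Live Link subject and
// drives morph targets of the same name on the given mesh.
void ApplyFaceCurves(USkeletalMeshComponent* Mesh)
{
    IModularFeatures& Features = IModularFeatures::Get();
    if (!Mesh || !Features.IsModularFeatureAvailable(ILiveLinkClient::ModularFeatureName))
    {
        return; // Live Link plugin not loaded
    }
    ILiveLinkClient& Client =
        Features.GetModularFeature<ILiveLinkClient>(ILiveLinkClient::ModularFeatureName);

    // Assumed subject name; use whatever your app streams under.
    const FLiveLinkSubjectName Subject = FName(TEXT("iPhone"));
    FLiveLinkSubjectFrameData FrameData;
    if (!Client.EvaluateFrame_AnyThread(Subject, ULiveLinkBasicRole::StaticClass(), FrameData))
    {
        return;
    }

    const FLiveLinkBaseStaticData* Static = FrameData.StaticData.Cast<FLiveLinkBaseStaticData>();
    const FLiveLinkBaseFrameData* Frame = FrameData.FrameData.Cast<FLiveLinkBaseFrameData>();
    if (!Static || !Frame)
    {
        return;
    }

    // Curves arrive under ARKit blendshape names (JawOpen, EyeBlinkLeft, ...).
    // A character that doesn't follow that naming convention needs a remap
    // table here instead of the direct 1:1 lookup — which is exactly the
    // kind of adaptation this video walks through in the editor.
    for (int32 i = 0; i < Static->PropertyNames.Num(); ++i)
    {
        Mesh->SetMorphTarget(Static->PropertyNames[i], Frame->PropertyValues[i]);
    }
}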
Comments

Every day, I get more and more scared. Unreal has meerkats now... and they can talk 😳

bev.tarzan

This meerkat demo has solved, one after another, all the problems I've been looking for solutions to

MagicSwordFilms

Quick tip - if you can't see the Live Link Debugger at 5:50, it is because you didn't toggle on the Live Link Curve Debug UI plugin (see the note below). ;)

rob
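A note on the tip above: the debugger ships in the engine's Live Link Curve Debug UI plugin, which you can toggle under Edit > Plugins or keep pinned in the project's .uproject file. Here is a sketch of the relevant section, assuming the plugin's internal name is LiveLinkCurveDebugUI (shown alongside the Live Link plugin itself):

{
  "Plugins": [
    { "Name": "LiveLink", "Enabled": true },
    { "Name": "LiveLinkCurveDebugUI", "Enabled": true }
  ]
}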

Great tutorial. Very clear and easily understood. Very creative.

borakian

Very easy to understand. Good tutorial 👍🏼

IRONFRIDGE

So far this is one of UE4's best features that doesn't rely on external tools. This is the right way to sell UE4, alongside the recent short film event in Australia.

TheAmoscokkie

But how can we control the eyes? Can we track and rotate the eyeballs too?

KriGeta

I don't understand why Unreal Engine would push tech onto Apple platforms. What if Epic and Apple fall out?

DabblerDotUK

If only Epic put even 1% of this effort into improving their game store, it too would be amazing. As it stands, it's a clunky mess.

Seriously impressive stuff shown here.

Wistbacka

Many thanks! Can we deploy the Meerkat demo as an iOS app with the fur intact?

ankur

When I add the new curves to rotate the head, they have X's next to them instead of a check, and the rotation doesn't work, even though I have it turned on in the app.

MagicSwordFilms

Very difficult and inconvenient to use

tnctr

Many thanks, Christian! You're very smart and you explain things very well; hopefully you'll also upload a tutorial on animating the eyes!
Cheers!!!

erickmoralvocal

When I create a human character animated with blendshapes and Niagara grooms, the model pokes through and the grooms aren't driven correctly by the blendshapes. How can I fix this issue?

huangti

This is awesome!!! I just followed along. Is there going to be a continuation of this?

FeedingWolves

I don't have an iPhone :v
I guess my game will feature expressionless dialogue :3

phantomgaming

By the time you retarget the blendshapes/morph targets to the character, it looks like you get only 5-10% of the movement. To take full advantage of this tech, you need to blow out the blendshapes to 200% or more (play with it). That way, once it's retargeted, there's closer to a 1:1 ratio of movement on the character. (See the sketch below.)

gollm
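In code terms, the workaround described above is a gain applied to each incoming curve value before it drives the morph target — whether you bake the blendshapes bigger in your DCC tool or scale the values at runtime, the math is the same. A minimal sketch, where RemapFaceCurve is a hypothetical helper name and the default gain of 2.0 is just the "200%" starting point to tune per character:

#include "Math/UnrealMathUtility.h"

// Hypothetical helper: scale a 0-1 ARKit curve value by a per-character
// gain so the retargeted mesh reaches a fuller range of motion, then
// clamp so over-driven shapes don't distort the mesh.
float RemapFaceCurve(float RawValue, float Gain = 2.0f)
{
    return FMath::Clamp(RawValue * Gain, 0.0f, 1.0f);
}

// Usage at the point where curves are applied:
//   Mesh->SetMorphTarget(CurveName, RemapFaceCurve(RawValue));

An equivalent in-editor route is a Modify Curve node in the Anim Blueprint applying the same multiplier.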

Do you have to have an iPhone 12, or can you do this with an iPhone 10 as well?

uja

Well, I don't have an iPhone :-/

Braindrain

Didn't Epic Games acquire Hyprsense? Wasn't Hyprsense working on something like this that worked on Android?

sigrid