Metahuman Animator Test on a Stylized Facemesh with Midjourney Texture | Animation Test


This is a test of the new Metahuman Animator facial mocap system from @UnrealEngine, plus a couple of other things. I ran a stylized head mesh through the new Mesh-to-Metahuman process (which is much better at preserving the stylized shape than the old version) and animated it with a video take of myself making some exaggerated faces. Note: this has zero mocap cleanup, it's just the straight take. The head is textured using the Midjourney generative AI. I fed a blank render of the head mesh into Midjourney with a prompt for a handpainted texture, projection mapped the result onto the mesh in Blender, and then handpainted the parts that came out messy (the sides and back, inside the mouth, etc.).
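For anyone curious about the projection-mapping step, here is a minimal Blender Python (bpy) sketch of one way to do it: a UV Project modifier driven by the same camera that produced the blank render. The object names, file path, and material setup below are placeholders for illustration, not the exact scene used in the video.

import bpy

# Placeholder names: adjust to match your scene.
head = bpy.data.objects["HeadMesh"]   # the stylized head mesh (hypothetical name)
cam = bpy.data.objects["Camera"]      # camera matching the blank render's angle

# Load the generated texture and build a simple image material.
img = bpy.data.images.load("//midjourney_face.png")  # hypothetical file path
mat = bpy.data.materials.new("ProjectedFace")
mat.use_nodes = True
tex = mat.node_tree.nodes.new("ShaderNodeTexImage")
tex.image = img
bsdf = mat.node_tree.nodes["Principled BSDF"]
mat.node_tree.links.new(tex.outputs["Color"], bsdf.inputs["Base Color"])
head.data.materials.append(mat)

# Make sure there is a UV layer for the projection to write into.
if not head.data.uv_layers:
    head.data.uv_layers.new(name="UVMap")

# UV Project modifier: projects the camera's view onto the mesh's UVs.
mod = head.modifiers.new(name="FaceProjection", type="UV_PROJECT")
mod.uv_layer = head.data.uv_layers[0].name
mod.projector_count = 1
mod.projectors[0].object = cam
mod.aspect_x = img.size[0]  # match the projected image's aspect ratio
mod.aspect_y = img.size[1]

From there you would typically apply the modifier (or bake the projection down to the mesh's final UV layout) and then handpaint the areas the camera could not see, as described above.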
Comments

Well, this opens a whole new realm of possibilities. Amazing.

JaidevCGartist

Damn, looks like Telltale Games is making a new Walking Dead game 👍.

SpectralGhost

This is so good. Is there somewhere I can see your process? It's so good.

geckogaming

Okay, so you rendered a blank mesh, ran that through Midjourney and asked it to handpaint it? Then projection mapped it in Blender? Then I guess you ran the head mesh with the projection mapped texture through Mesh-to-Metahuman? Then you used Metahuman Animator to animate it? Great work, by the way. Just trying to figure out your workflow so I can try the same. Thanks for sharing!

ielohim