Motion capture using 3D pose estimation with Unity and MediaPipe


It captures people's motion from a video file and overlays it on a specified 3D model in real time (30 fps and above). This demo was recorded with the Unity Recorder.

============================

Captures people's motion from video and maps it onto 3D models. The demo is built in Unity using the MediaPipeUnityPlugin.
Comments

Nice job. Could you please share how you were able to access the actual landmark data once you had the MediaPipeUnityPlugin demo scene running?

infinitr

Great work! How do you animate a 3D avatar in Unity using the 3D pose? Could you share code or guidance on this process?

邓文晋

I am currently working on a project similar to yours and have encountered an issue: the model needs to adjust in size depending on whether a person is approaching or moving away from the camera. Could you kindly share any insights on how to overcome this obstacle? Thank you.

QingYuMu-vn
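Not the author's confirmed approach, but one common trick for the question above: under a pinhole camera model, apparent size is inversely proportional to depth, so a stable body span (e.g. shoulder width) measured in the normalized 2D landmarks gives a relative depth, and from that a scale (or Z translation) for the model. A minimal Python sketch; the two constants are made-up calibration values, not numbers from the video:

```python
import math

REFERENCE_SHOULDER_M = 0.38   # assumed real shoulder width, metres
FOCAL = 1.2                   # assumed focal length, normalized-image units

def shoulder_width(landmarks):
    """Apparent distance between left (11) and right (12) shoulder
    in MediaPipe's normalized 2D coordinates."""
    lx, ly = landmarks[11]
    rx, ry = landmarks[12]
    return math.hypot(lx - rx, ly - ry)

def estimate_depth(landmarks):
    """Pinhole model: apparent_size = focal * real_size / depth."""
    return FOCAL * REFERENCE_SHOULDER_M / shoulder_width(landmarks)

def model_scale(landmarks, base_depth=2.0):
    """Grow the avatar as the subject approaches the camera."""
    return base_depth / estimate_depth(landmarks)
```

In practice the raw value jitters from frame to frame, so smoothing it (e.g. with an exponential moving average) before applying it to the model is usually necessary.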

This is great work! I was wondering how you mapped the values coming out of MediaPipe to the 3D model? I was able to retrieve the values from MediaPipe but am not sure how to map them to a 3D object.

arnabchaky

Hi, do you have at least pseudocode for this? I'm having problems mapping and getting the rotation. I tried mapping by taking the PoseAnnotation positions, but my 3D model stretches its bones instead of scaling properly.

kimkaye
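Bone stretching usually means joint positions are being set directly from the landmarks. A common alternative (a sketch, not necessarily what the video does) is to keep the rig's bone lengths fixed and drive only rotations: for each bone, rotate its rest direction onto the landmark-to-landmark direction. In Unity that is exactly what `Quaternion.FromToRotation` computes; the same math in plain Python, with hypothetical landmark values:

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def from_to_quaternion(a, b):
    """Quaternion (w, x, y, z) rotating unit vector a onto unit vector b,
    the same construction as Unity's Quaternion.FromToRotation.
    (Degenerate when a is opposite b; a real implementation handles that.)"""
    a, b = normalize(a), normalize(b)
    cx = a[1] * b[2] - a[2] * b[1]          # cross product a x b
    cy = a[2] * b[0] - a[0] * b[2]
    cz = a[0] * b[1] - a[1] * b[0]
    dot = a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
    return normalize((1.0 + dot, cx, cy, cz))

# Hypothetical upper-arm bone: rest pose points straight down,
# target direction is shoulder-landmark -> elbow-landmark.
shoulder = (0.0, 1.4, 0.0)
elbow = (0.3, 1.1, 0.0)
rest_dir = (0.0, -1.0, 0.0)
target_dir = tuple(e - s for e, s in zip(elbow, shoulder))
q = from_to_quaternion(rest_dir, target_dir)  # rotate the bone; length unchanged
```

Driving rotations this way keeps every bone at its rigged length, so the model cannot stretch; only the root of the chain (e.g. the hips) needs a translation from the landmarks.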

Hello, sir, nice project. I just want to know how you solved the character's rotation. When I feed the MediaPipe landmark data directly into Unity and use IK to sync the character, it moves successfully but can't rotate, especially at the waist (there is no waist landmark, and MediaPipe provides no rotation data). How can we solve this? Or do I need to send the data to Blender first (since Blender has bone-rig rotation settings) and then transfer it to Unity?

MarkTreeNewBee
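Since MediaPipe outputs positions only, a waist/hip orientation has to be derived from them. One common construction (a sketch under assumed coordinate conventions, not taken from the video) builds an orthonormal pelvis frame from the two hip landmarks plus the shoulder midpoint; the frame can then be turned into a quaternion and applied to the hip bone directly in Unity, with no Blender round-trip:

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def midpoint(a, b):
    return tuple((x + y) / 2 for x, y in zip(a, b))

def pelvis_frame(l_hip, r_hip, l_shoulder, r_shoulder):
    """Orthonormal (right, up, forward) axes for the pelvis, built purely
    from landmark positions. Axis signs depend on coordinate handedness."""
    right = normalize(tuple(r - l for r, l in zip(r_hip, l_hip)))
    spine = tuple(s - h for s, h in
                  zip(midpoint(l_shoulder, r_shoulder), midpoint(l_hip, r_hip)))
    forward = normalize(cross(right, spine))
    up = cross(forward, right)          # re-orthogonalized up axis
    return right, up, forward
```

The same idea extends to the chest (shoulder landmarks plus the spine direction), which gives the torso twist that IK alone cannot recover.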

How did you map the landmarks onto a 3D avatar? I am able to visualize the landmarks in Unity, but I have problems creating the skeleton from them.

MirageDev_
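On the skeleton question: MediaPipe Pose outputs 33 independent points with no hierarchy, so the topology has to be imposed by hand. A minimal sketch (the landmark indices follow the MediaPipe Pose documentation; the parent map itself is an assumption and covers only arms and legs):

```python
# Partial parent map over MediaPipe Pose's 33 landmark indices
# (11/12 = shoulders, 13/14 = elbows, 15/16 = wrists,
#  23/24 = hips, 25/26 = knees, 27/28 = ankles).
PARENT = {
    13: 11, 15: 13,   # left arm:  shoulder -> elbow -> wrist
    14: 12, 16: 14,   # right arm
    25: 23, 27: 25,   # left leg:  hip -> knee -> ankle
    26: 24, 28: 26,   # right leg
}

def bones(landmarks):
    """Return (parent_idx, child_idx, direction) triples, one per bone,
    where direction is the child position minus the parent position."""
    out = []
    for child, parent in PARENT.items():
        px, py, pz = landmarks[parent]
        cx, cy, cz = landmarks[child]
        out.append((parent, child, (cx - px, cy - py, cz - pz)))
    return out
```

Each triple maps naturally onto one rigged bone: the direction drives the bone's rotation, while the rig keeps its own lengths.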

Hi, thank you so much for the great tutorial.

I am going to use the MediaPipe Unity plugin for my project. In it I need to implement Jump, Run, SwipeLeft, SwipeRight, etc. gestures with MediaPipe pose tracking using a web camera.

Can you please confirm that these gestures are possible with MediaPipe pose tracking?

If so, could you please help me or share some references for implementing these gestures in Unity?

Thanks in advance.

queriesask

Hello. How did you solve the problem of matching the 2D projection on the screen with the 3D pose (which is relative to an origin between the hips) in camera space? Is there maybe a formula? And is an orthographic or a perspective camera used?

POTKPOT
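One plausible answer to the question above (a sketch, not the author's confirmed formula): with a perspective camera, MediaPipe's two outputs can be fused by solving for the hip depth that makes a known body span project to its observed 2D size. Here `img[i]` are the normalized 2D landmarks, `world[i]` the metric 3D landmarks with origin at the hip midpoint, and `F` an assumed focal length:

```python
import math

F = 1.4  # assumed focal length in normalized-image units

def hip_in_camera_space(img, world, i=11, j=12):
    """Place the hip origin by matching the shoulder span (i, j)
    between the 2D projection and the metric 3D pose."""
    d_world = math.dist(world[i], world[j])   # span in metres
    d_img = math.dist(img[i], img[j])         # span in normalized units
    z = F * d_world / d_img                   # pinhole: d_img = F * d_world / z
    # Back-project the 2D hip midpoint (landmarks 23/24) at that depth.
    hu = (img[23][0] + img[24][0]) / 2 - 0.5
    hv = (img[23][1] + img[24][1]) / 2 - 0.5
    return (hu * z / F, hv * z / F, z)

def landmark_in_camera_space(idx, img, world):
    """Full camera-space position: hip origin plus hip-relative offset."""
    hx, hy, hz = hip_in_camera_space(img, world)
    wx, wy, wz = world[idx]
    return (hx + wx, hy + wy, hz + wz)
```

This assumes a perspective camera; with an orthographic camera the depth term drops out and only the 2D landmarks plus a fixed scale are needed.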

Hello! That looks nice, but I want to ask: why didn't you use Barracuda for Unity?

Shian_n

Does it give you an animation file after it's done estimating?

youtube

This is amazing! Any way to get it to work with Daz 3D?

themightyflog

Can you please help me? I tried the same, but my 3D model in Unity just breaks when I try to move the armature based on the MediaPipe coordinates.

amargwari

How long does it take for MediaPipe to run pose estimation on a video?
I tried a simple OpenCV approach on CPU: it took over 50 minutes to estimate 10-12 seconds of footage, and the entire PC was lagging the whole time, so it's definitely unreliable.
I hope MediaPipe is better.
Please share your PC specs, as well as how long it took to estimate the motion in that video.

llIIllIlIIllX_XIillIIllIIllIll