Humanoid Robot Control Using Human Joint Angles via 2D Camera

I developed a MediaPipe interface that extracts human upper-body joint angles, head rotation, and hand open/close status from a 2D camera (an ordinary laptop camera) to control the humanoid robots NAO and Pepper in PyBullet simulation.
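The exact interface is in the linked source code; the snippet below is only a minimal sketch of the landmark-extraction step, assuming the standard MediaPipe Pose solution and an ordinary laptop webcam opened with OpenCV. The specific landmarks picked out here are illustrative, not the full set the interface uses.

```python
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose

# Minimal sketch: stream frames from a laptop camera and extract
# upper-body landmarks with MediaPipe Pose.
cap = cv2.VideoCapture(0)
with mp_pose.Pose(model_complexity=1,
                  min_detection_confidence=0.5,
                  min_tracking_confidence=0.5) as pose:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB; OpenCV delivers BGR.
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks:
            lm = results.pose_landmarks.landmark
            shoulder = lm[mp_pose.PoseLandmark.LEFT_SHOULDER]
            elbow = lm[mp_pose.PoseLandmark.LEFT_ELBOW]
            # shoulder.x / shoulder.y are normalized image coordinates;
            # they feed the joint-angle computation described in the abstract.
        if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
            break
cap.release()
```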
ABSTRACT
Control over humanoid robots is necessary for deploying them in many fields. With Robot Learning from Demonstration (LfD), humanoids can be taught and trained to perform the physical workloads that humans have to do. Here, the feasibility of controlling humanoid robots is of utmost importance. The goal of this research is to investigate how feasible it is to operate a humanoid robot using human joint angles captured by a 2D camera. To handle the embodiment mismatch, we directly map human joint angles onto a humanoid robot. Our method does not require depth information from the 2D image to transfer the abduction/adduction angles of human joint movements, and we utilise a similar generic method to obtain the flexion/extension angles. The external/internal rotations of human joint movements, which correspond to yaw control of humanoid robot joints, are not covered in this research. We evaluate the approach by comparing the angles it produces with ground-truth angles and by experimenting with 16 participants. The evaluations suggest that humanoid robots imitate abduction/adduction movements more successfully than flexion/extension movements. Our work shows that controlling a humanoid robot using human joint angles captured by a 2D camera is feasible. Finally, we implement an interface for our approach that exposes its outputs: head nod and tilt angles, an open/closed hand status classifier, and human upper-body joint angles based on human joint 3D coordinates captured by a 2D camera.
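The paper's exact angle definitions live in the linked source code; the sketch below only illustrates the general idea of recovering a planar joint angle from 2D landmarks alone (no depth) and forwarding it to a simulated joint in PyBullet. The function names, the robot ID, and the joint index are placeholders for illustration, not the authors' API.

```python
import numpy as np
import pybullet as p

def planar_angle(a, b, c):
    """Angle at joint b formed by points a-b-c, using only 2D (x, y)
    image-plane coordinates, i.e. without depth information."""
    v1 = np.array([a[0] - b[0], a[1] - b[1]])
    v2 = np.array([c[0] - b[0], c[1] - b[1]])
    cos_t = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-8)
    return np.arccos(np.clip(cos_t, -1.0, 1.0))  # radians

def send_to_robot(robot_id, joint_index, angle_rad):
    # Hypothetical mapping: drive one robot joint (e.g. a NAO shoulder)
    # in PyBullet with the measured angle. robot_id and joint_index
    # depend on the loaded URDF.
    p.setJointMotorControl2(robot_id, joint_index,
                            controlMode=p.POSITION_CONTROL,
                            targetPosition=angle_rad)
```

For example, feeding the left hip, left shoulder, and left elbow landmarks (as (x, y) pairs) to planar_angle gives an abduction/adduction-style shoulder angle directly from the image plane.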
Source Code: