Hand Tracking & Gesture Control With Raspberry Pi + OpenCV + Python

Identify and track every joint in the fingers of your hands, live. Then use your hands to send commands that control media software and GPIO-attached hardware, all via a Raspberry Pi single-board computer.

Make sure to use the previous Raspberry Pi 'Buster' OS with this guide.

Related Information

Machine and deep learning have never been more accessible, as this video will demonstrate. Cameras combined with machine learning create the most powerful sensor you can put on a Raspberry Pi single-board computer. Today is all about real-time hand recognition and finger identification via computer vision, with our Raspberry Pi doing all the hard work. The system built here uses OpenCV, particularly CVZone, a huge package that helps solve real-time computer vision and image-processing problems. It also uses MediaPipe for real-time hand identification, which runs a TensorFlow Lite delegate during script operation for hardware acceleration (this guide has it all!). Check the full guide on how to install these correctly and download the scripts.

There are other types of gesture-recognition technology that will work with a Raspberry Pi 4 Model B; for instance, you can also do hand or gesture identification with PyTorch, Haar cascades, or YOLO/YOLOv2 packages, but the MediaPipe dataset and system used in this guide is far superior.

The first script, when run, will identify any hands seen in front of the camera through computer vision and then use machine learning to draw a hand framework over the top of any hands identified. The second script will output to the shell a statement on the total finger count (both up and down) and specific details on whether each finger is up or down. The third and fourth scripts are all about controlling hardware and software with your hands. The third uses a GlowBit Matrix 4x4: the number of fingers you show produces different colours on the matrix. The final script lets you control a VLC media player (play, pause, volume control) all through your fingertips. Gesture Volume Control success! All the scripts are fully open source and can readily be expanded, taking your projects to amazing places.
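As a rough illustration of what the second script's finger counting boils down to (a hedged sketch, not the guide's actual code: in the real scripts the landmark list comes from cvzone's HandDetector on live camera frames, and the indices follow MediaPipe's 21-point hand model, where image y grows downward):

```python
# Sketch: count raised fingers from a 21-point MediaPipe-style landmark list.
# Landmarks are (x, y) pixel coordinates; y increases downward in image space.
# A finger counts as "up" when its tip sits above the middle joint below it.

FINGER_TIPS = [8, 12, 16, 20]   # index, middle, ring, pinky fingertips
FINGER_PIPS = [6, 10, 14, 18]   # the joint two places below each tip

def fingers_up(landmarks):
    """Return a list of 0/1 flags for [index, middle, ring, pinky]."""
    flags = []
    for tip, pip in zip(FINGER_TIPS, FINGER_PIPS):
        flags.append(1 if landmarks[tip][1] < landmarks[pip][1] else 0)
    return flags

# Synthetic upright hand: index finger raised, the other three curled.
points = [(0, 0)] * 21
points[8], points[6] = (50, 20), (50, 60)      # index: tip above joint -> up
points[12], points[10] = (70, 80), (70, 60)    # middle: tip below joint -> down
points[16], points[14] = (90, 80), (90, 60)    # ring: down
points[20], points[18] = (110, 80), (110, 60)  # pinky: down

print(fingers_up(points), "total:", sum(fingers_up(points)))
```

With cvzone, `detector.fingersUp(hand)` performs this kind of test internally (including special handling for the thumb); the sketch only shows the underlying geometry.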

Core Electronics is located in the heart of Newcastle, Australia. We're powered by makers, for makers. Drop by!

0:00 Intro
0:13 Video Overview
0:36 What You Need
1:40 Download the Scripts
2:03 Simple Hand Tracking Script
2:25 First Pay Off
2:40 Tracking More Hands
3:18 X-Y Data of a Single Point on Hand
3:48 Fingers Up or Down Script
4:29 Second Pay Off
5:16 Text to Speech Feature
5:43 GlowBit Matrix GPIO Control Script
6:10 Third Pay Off
6:20 GlowBit Script Explanation
8:53 Accessibility/Media Control Script
9:15 Final Pay Off
9:42 Macro and Script Explanation
12:15 Outro

The following trademarks are owned by Core Electronics Pty Ltd:
"Core Electronics" and the Core Electronics logo
"Makerverse" and the Makerverse logo
"PiicoDev" and the PiicoDev logo
"GlowBit" and the GlowBit logo
Comments

Finally, the tutorial I was waiting for the most. Just love it!

sumedh

Awesome dude, I just got my Pi 4 from you guys today.

TheLiquidMix

Keep it up, man, you are awesome and keep things simple.

asirisudarshana

This is absolutely bonkers!
The project is incredible, and I'm blown away by the quality of the tutorial. Keep up the awesome work man!

adamboden

I am amazed at all the effort that you must have put into these projects. Thank you so much.

jeffschroeder

So nice. Thank you so much. I love it very much, great tutorial!!! Much appreciated!

yingwaisia

Thank you for the wonderful project and tutorial. I just want to ask how I can access those specific joints in the hand so that I can compare them against my custom-made hand gestures?
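For readers with the same question: cvzone's HandDetector returns each detected hand as a dict whose "lmList" entry holds the 21 landmark coordinates, so individual joints can be read by index. A hedged sketch (landmark indices per MediaPipe's hand model; the "pinch" gesture and the 40 px threshold are arbitrary illustrations, not from the guide):

```python
import math

# Sketch: read specific joints from a cvzone-style hand dict and test a
# custom "pinch" gesture (thumb tip close to index fingertip). Each entry
# of 'lmList' is [x, y, z] in pixels; index 4 is the thumb tip and
# index 8 is the index fingertip in MediaPipe's layout.

def is_pinch(hand, threshold=40):
    lm = hand["lmList"]
    thumb_tip, index_tip = lm[4], lm[8]
    dist = math.hypot(thumb_tip[0] - index_tip[0], thumb_tip[1] - index_tip[1])
    return dist < threshold

# Synthetic hand with the two fingertips 30 px apart.
fake_hand = {"lmList": [[0, 0, 0]] * 21}
fake_hand["lmList"][4] = [100, 100, 0]
fake_hand["lmList"][8] = [130, 100, 0]
print(is_pinch(fake_hand))  # True: 30 px < 40 px threshold
```

The same pattern (index into "lmList", then compare distances or angles between joints) extends to any custom gesture template.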

zgryx

I'm creating an invention and I really need the hand gestures, so thanks 🙏 😊

AnthonielPinnock-cird

Keep it up, nice video clip, thank you for sharing it :)

Bianchi

Is this technology affected by a high volume of UV light (i.e. the light generated when arc welding), or would the camera require a lens to reduce light? Also, could it track the light against a reference point to determine how fast the light is moving linearly (2D and 3D)?

chapincougars

Would this model be plug-and-play with a Coral USB Accelerator, or are there other tasks when adding one?

ryandowney

Is it possible to run this on a Raspberry Pi Zero 2 W?

elvinmirzezade

Damn, feels just like Tony Stark. Awesome, thanks!

adrienguidat

Thanks for the video, but can you make a pre-installed OpenCV and TensorFlow .img format or OS for Raspberry Pi?

adityajadhav

👌👌excellent project, thanks for sharing

ajithsb

Hi, can this method be used in a dark room where you will be on your couch?

alenninan

Very nice.

You should give this a try with a Coral TPU?
I'm pretty sure it works nicely with a Pi and OpenCV.

nerdy_dav

That's clever.
I didn't realize a Pi had enough grunt to do live image recognition like that.

pileofstuff

Interesting. I wonder how much better this could get with dual cameras on the Pi 5 now.

leonoliveira

Hi Tim, how can I add thumb tracking to the Are Fingers up or Down.py code? I've added the thumb ID (4) to the list, but I'm not sure what I need to adjust afterwards. Can you please assist? Thanks in advance!
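For readers hitting the same snag: the thumb can't reuse the tip-above-joint test, because it folds sideways across the palm. A common approach (a hedged sketch, not the guide's script; indices per MediaPipe's hand model, and the comparison direction flips for left vs right hands or a mirrored camera feed) is to compare x-coordinates of the thumb tip (4) and the joint below it (3):

```python
# Sketch: thumb up/down test. Unlike the other fingers, the thumb folds
# across the palm, so we compare x-coordinates instead of y-coordinates.
# Landmarks are (x, y) pixel coordinates; whether "extended" means a
# smaller or larger x depends on which hand it is and on mirroring.

def thumb_up(landmarks, hand_type="Right"):
    tip_x = landmarks[4][0]    # thumb tip
    joint_x = landmarks[3][0]  # thumb IP joint, just below the tip
    if hand_type == "Right":
        return 1 if tip_x < joint_x else 0
    return 1 if tip_x > joint_x else 0

# Synthetic right hand with the thumb extended out to the left.
pts = [(0, 0)] * 21
pts[4], pts[3] = (20, 100), (60, 100)  # tip left of joint -> extended
print(thumb_up(pts, "Right"))  # 1
```

cvzone's `detector.fingersUp(hand)` already applies this style of x-axis test for the thumb, which is why the thumb slot behaves differently from the four finger slots.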

Redbeef