Hand Tracking & Gesture Control With Raspberry Pi + OpenCV + Python
Identify and track every finger joint in your hands, live, then use your hands to send commands that control media software and GPIO-attached hardware, all via a Raspberry Pi single-board computer.
Make sure to use the previous Raspberry Pi 'Buster' OS with this guide.
Related Information
Machine and deep learning have never been more accessible, as this video demonstrates. A camera combined with machine learning is the most powerful sensor you can put on a Raspberry Pi single-board computer. Today is all about real-time hand recognition and finger identification via computer vision, with our Raspberry Pi doing all the hard work. The system built here uses OpenCV, particularly CVzone, a large package that helps solve real-time computer vision and image-processing problems. It also uses MediaPipe for real-time hand identification, which runs a TensorFlow Lite delegate during script operation for hardware acceleration (this guide has it all!). Check the full guide for how to install these correctly and to download the scripts.

Other gesture recognition technologies also work with a Raspberry Pi 4 Model B: you can do hand or gesture identification with PyTorch, Haar cascades, or YOLO/YOLOv2 packages, but the MediaPipe model and system used in this guide is far superior.

The first script, when run, identifies any hands seen in front of the camera through computer vision, then uses machine learning to draw a hand framework over the top of each identified hand. The second script prints to the shell the total finger count (both up and down) and whether each individual finger is up or down. The third and fourth scripts are all about controlling hardware and software with your hands. The third uses a GlowBit Matrix 4x4: the number of fingers you show produces different colours on the matrix. The final script lets you control VLC media player (play, pause, volume control) with your fingertips. Gesture volume control success! All the scripts are fully open source and can readily be expanded, taking your projects to amazing places.
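The finger up/down logic described above can be sketched as a pure function over MediaPipe-style hand landmarks. This is a common heuristic, not the guide's exact code: landmark indices follow MediaPipe's 21-point hand model (4 = thumb tip, 8/12/16/20 = the other fingertips, 6/10/14/18 = the joints below them), and the report formatting is our own.

```python
# Finger up/down from MediaPipe-style hand landmarks: a list of 21
# (x, y) points in image coordinates, where y grows downward.
# Heuristic: a finger is "up" when its tip sits above (smaller y than)
# the joint two indices below it; the thumb is compared on x instead,
# and the comparison direction flips with handedness/mirroring.
TIP_IDS = [4, 8, 12, 16, 20]

def fingers_up(lm, right_hand=True):
    """Return [thumb, index, middle, ring, pinky] as 1 (up) / 0 (down)."""
    fingers = []
    # Thumb: compare tip x against the joint next to it.
    if right_hand:
        fingers.append(1 if lm[4][0] < lm[3][0] else 0)
    else:
        fingers.append(1 if lm[4][0] > lm[3][0] else 0)
    # Remaining four fingers: tip above the PIP joint means "up".
    for tip in TIP_IDS[1:]:
        fingers.append(1 if lm[tip][1] < lm[tip - 2][1] else 0)
    return fingers

def finger_report(fingers):
    """Format the kind of shell statement the second script prints."""
    names = ["Thumb", "Index", "Middle", "Ring", "Pinky"]
    states = ", ".join(f"{n} {'up' if f else 'down'}"
                       for n, f in zip(names, fingers))
    return f"{sum(fingers)} finger(s) up: {states}"
```

In the real scripts this function would be fed by the detector each frame (CVzone's `HandDetector` provides an equivalent `fingersUp()` helper); here it is kept camera-free so the logic is easy to test.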
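The third and fourth scripts boil down to mapping a recognised gesture to an output. A minimal sketch of that idea is below; the specific colours, gestures, and hotkeys are our own illustrative choices, not the guide's (though the hotkeys are VLC's defaults, which a script could send with a keystroke tool such as pyautogui or xdotool).

```python
# Illustrative gesture-to-output mappings in the spirit of the GlowBit
# and VLC scripts. Keys into GESTURE_TO_VLC are fingersUp()-style
# tuples: (thumb, index, middle, ring, pinky), 1 = up.
COUNT_TO_COLOUR = {
    0: (0, 0, 0),      # no fingers: matrix off
    1: (255, 0, 0),    # red
    2: (255, 165, 0),  # orange
    3: (255, 255, 0),  # yellow
    4: (0, 255, 0),    # green
    5: (0, 0, 255),    # blue
}

GESTURE_TO_VLC = {
    (1, 1, 1, 1, 1): "space",      # open hand: toggle play/pause
    (0, 1, 0, 0, 0): "ctrl+up",    # index only: volume up
    (0, 1, 1, 0, 0): "ctrl+down",  # index + middle: volume down
}

def colour_for(fingers):
    """RGB colour to fill the 4x4 matrix with, from a finger list."""
    return COUNT_TO_COLOUR[sum(fingers)]

def vlc_action(fingers):
    """VLC hotkey for a recognised gesture, or None if unrecognised."""
    return GESTURE_TO_VLC.get(tuple(fingers))
```

Keeping the mapping in plain dictionaries like this makes the scripts easy to expand: adding a new gesture is one line, with the camera and detector code untouched.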
Core Electronics is located in the heart of Newcastle, Australia. We're powered by makers, for makers. Drop by if you are looking for:
0:00 Intro
0:13 Video Overview
0:36 What You Need
1:40 Download the Scripts
2:03 Simple Hand Tracking Script
2:25 First Pay Off
2:40 Tracking More Hands
3:18 X-Y Data of a Single Point on Hand
3:48 Fingers Up or Down Script
4:29 Second Pay Off
5:16 Text to Speech Feature
5:43 GlowBit Matrix GPIO Control Script
6:10 Third Pay Off
6:20 GlowBit Script Explanation
8:53 Accessibility/Media Control Script
9:15 Final Pay Off
9:42 Macro and Script Explanation
12:15 Outro
The following trademarks are owned by Core Electronics Pty Ltd:
"Core Electronics" and the Core Electronics logo
"Makerverse" and the Makerverse logo
"PiicoDev" and the PiicoDev logo
"GlowBit" and the GlowBit logo