VIM3 - Realtime Object Detection Using Yolo v3

In this video, You Jun from the Khadas Team demonstrates the VIM3's 5.0 TOPS (tera operations per second) NPU (neural processing unit), using the Yolo v3 object detection model! Watch the video to find out how to replicate this result on your own VIM3.
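The Khadas demo itself runs on the NPU through the vendor SDK; as a rough illustration of the same Yolo v3 detection loop, here is a minimal CPU-only sketch using OpenCV's DNN module. The "yolov3.cfg" / "yolov3.weights" / "coco.names" file names and the index-0 USB webcam are assumptions for the sketch, not part of the Khadas demo.

```python
# Rough CPU-only illustration of YOLO v3 detection with OpenCV's DNN module.
# This is NOT the Khadas NPU demo; "yolov3.cfg", "yolov3.weights" and
# "coco.names" are the stock Darknet files, downloaded separately.
import cv2

net = cv2.dnn.readNetFromDarknet("yolov3.cfg", "yolov3.weights")
out_layers = net.getUnconnectedOutLayersNames()
classes = open("coco.names").read().splitlines()

cap = cv2.VideoCapture(0)  # first USB webcam (assumption)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # YOLO v3 expects a 416x416, 0-1 scaled, RGB blob
    blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416),
                                 swapRB=True, crop=False)
    net.setInput(blob)
    for output in net.forward(out_layers):
        for det in output:          # det = [cx, cy, w, h, objectness, class scores...]
            scores = det[5:]
            cls = int(scores.argmax())
            if scores[cls] > 0.5:   # confidence threshold
                print(classes[cls], float(scores[cls]))
    cv2.imshow("YOLO v3 sketch", frame)
    if cv2.waitKey(1) == 27:        # Esc quits
        break
cap.release()
cv2.destroyAllWindows()
```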

Many thanks to our developers, who spent many tireless hours photographing, cleaning, tagging, and training the A.I. model. Due to limited image data it isn't perfect, but it's a good demonstration of what can be achieved with the Khadas VIM3's Amlogic A311D SoC.

In our next video, we will go into more depth on how you can train your own A.I. object detection models.

More information:

1. Download *this* NPU demo for your VIM3

2. Apply for VIM3 NPU Toolkit:

3. How to use the NPU:

Required items for this build:

1. "VIM3":

2. "VIM3L":

3. "USB Webcam"
* fast auto-focus / fixed focal length is preferred

4. "MIPI-CSI Camera for VIM3"

5. "Training PC"
Nvidia GTX 1080 Ti
Intel Core i7-9800X

GPU Training:
1. Graphics card with at least 8 GB of VRAM (e.g. a GTX 1080)
2. CPU: at least an Intel Core i7-9700

CPU Training (not recommended):
1. At least 16 GB of RAM
2. At least 8 CPU cores (a quick script for checking these minimums is sketched below)
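As a rough aid, here is a small Python sketch to check a machine against the minimums listed above; it assumes the third-party psutil package is installed and skips the GPU check if nvidia-smi is not on the PATH.

```python
# Rough check of the training-hardware minimums listed above.
# psutil is a third-party package (pip install psutil); nvidia-smi must be
# on PATH for the GPU query and is skipped otherwise.
import os
import shutil
import subprocess

import psutil

ram_gb = psutil.virtual_memory().total / 2**30
cores = os.cpu_count() or 0
print(f"RAM:   {ram_gb:.1f} GB (want >= 16 for CPU training)")
print(f"Cores: {cores} (want >= 8 for CPU training)")

if shutil.which("nvidia-smi"):
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.total", "--format=csv,noheader,nounits"],
        text=True,
    )
    for line in out.strip().splitlines():
        print(f"GPU VRAM: {int(line)} MiB (want >= 8192 for GPU training)")
else:
    print("nvidia-smi not found -- skipping GPU VRAM check")
```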

Credits:

1. Music:
YouTube Free Music Library, Digital Secrets

2. Videography:

Purchase a VIM3:
Comments

Great video -- really interesting demo. :)

ExplainingComputers

It's a very interesting video. If I had watched it a few days ago, I would have bought a Khadas board instead of a Raspberry Pi 4.

hologramh

Three years since this video was posted and there are still no clear documents explaining how to use the NPU from OpenCV or TensorFlow. Before even getting that far, the claim that these boards work with USB cameras is not entirely true. Yes, they may work with most cameras, but not all. I couldn't get the board to work with a PS3 Eye camera; the same camera works well with other boards such as the Raspberry Pi 3B. Maybe the OS is missing the required driver modules.

After failing to get the board to work with the PS3 Eye camera, I simply used the old Raspberry Pi to stream video from the camera. With that I was able to read the video stream over IP and do things like face detection, but only on the CPU, at 15 FPS at best.
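For reference, reading an IP stream and doing CPU-only face detection with OpenCV looks roughly like the sketch below; the stream URL is a placeholder, and the bundled Haar cascade stands in for whatever detector was actually used.

```python
# Reads a video stream served by another machine (URL is a placeholder)
# and runs CPU-only Haar-cascade face detection, similar to what is
# described above. Expect modest frame rates on a small board.
import cv2

STREAM_URL = "http://192.168.1.50:8080/?action=stream"  # placeholder
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture(STREAM_URL)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("faces", frame)
    if cv2.waitKey(1) == 27:  # Esc quits
        break
cap.release()
cv2.destroyAllWindows()
```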

I could not get the C++/Python demos to use the NPU. Then I tried to get the C++ demos to read from the IP stream, without much luck.

On the whole, even after three years, this product and its support are not mature enough for anyone to take it seriously.

Mirchitunes

Great video. I'll also make a video about it soon. I now also have the NVIDIA Jetson Nano, so it will be nice to compare the performance of both. Thank you.

NicoDsSBCs

Khadas products look interesting. Could I use the VIM3 on my early-90s model vehicle? I'd like to use the VIM3 or the Edge, which is built for industrial purposes, to mod and update my pickup. The VIM3 is good for a rear- and front-view collision camera system and a holographic HUD. The Edge is great for making a digital, virtual dashboard with telemetry, an audio control system, a passenger entertainment console, GPS & WiFi, and A/C HVAC controls.
The mods are endless!

joelg

As somebody who has not been too much into AI and ML yet, I'm confused by the terms. Isn't this demo more about machine learning? Training an algorithm to recognize predefined objects sounds like ML to me. AI, on the other hand, I'd expect to be the next stage... like scanning the environment, matching it against Google Reverse Image Search, and then learning that it's a Rubik's Cube.

HXKM

What about camera support? Are there any non-Khadas cameras that work on the VIM3? Like Raspberry Pi cameras?

bachshukla

The board is powerful, but the software ecosystem is extremely weak compared to Jetson. Perhaps it's for hobbyists, like the Raspberry Pi, but not for the practical applications Jetson specializes in.

Zeesh

Once the board is "trained", what are some of the practical applications that can be implemented? I'm wondering what applications can be implemented in the DIYer's space, and how?

It looks like there are some exciting possibilities.

randallnelson

Hi, what video camera are you using to shoot 240 fps at 4K?

williamlee

Is there any way to get products for review? I want to make good tutorial videos for my subscribers.

LifeCoding

Can I run Windows Python TensorFlow from this?

naasvanrooyen

Hi! What about the standard Tiny YOLO model? Could you make a video showing the command line after you run the project, and open the tiny-yolo.cfg file that you use?
I currently use the Jetson Nano and TX2 in my projects, and this video shows a really interesting board. But before I spend my time and money on it, I would like to see how it works with the same dataset.

drevocd

Can you please provide a video on how to run our own model on the VIM3 NPU? I am trying to run TensorFlow and .tflite models on this NPU. Need urgent help, please!
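As a baseline before any NPU work, a .tflite model can at least be sanity-checked on the CPU with the standard TensorFlow Lite interpreter, as sketched below; "model.tflite" is a placeholder path, and running on the VIM3 NPU still requires the vendor conversion toolkit.

```python
# CPU-only sanity check of a .tflite model with TensorFlow Lite.
# This does NOT use the VIM3 NPU; "model.tflite" is a placeholder path.
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Feed random data matching the model's declared input shape and dtype
dummy = np.random.random_sample(tuple(inp["shape"])).astype(inp["dtype"])
interpreter.set_tensor(inp["index"], dummy)
interpreter.invoke()
print("output shape:", interpreter.get_tensor(out["index"]).shape)
```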

dhruvgaba

Thanks. What are the video size/resolution and the FPS attained?

anpr

Is it possible to do that with an RTX 2070?

fabrizio-

It needs to support YOLOv4 as well, and v5 is already out.

zultandimitry