CUDA Explained - Why Deep Learning uses GPUs

💡Enroll to gain access to the full course:

Artificial intelligence with PyTorch and CUDA. Let's discuss how CUDA fits in with PyTorch, and more importantly, why we use GPUs in neural network programming.
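A minimal sketch, assuming a standard PyTorch install with an NVIDIA GPU (the layer and tensor names are illustrative, not taken from the course), of how CUDA typically enters a PyTorch program:

import torch
import torch.nn as nn

# Use the GPU through CUDA if one is available, otherwise fall back to the CPU.
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

model = nn.Linear(784, 10).to(device)          # parameters now live on the chosen device
batch = torch.randn(64, 784, device=device)    # create the input directly on the same device

output = model(batch)                          # the matrix multiply runs on the GPU when device is 'cuda'
print(output.shape, output.device)

Everything downstream (training loops, network definitions) stays the same; only the device changes.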

🕒🦎 VIDEO SECTIONS 🦎🕒

00:30 Help deeplizard add video timestamps - See example in the description
13:03 Collective Intelligence and the DEEPLIZARD HIVEMIND

💥🦎 DEEPLIZARD COMMUNITY RESOURCES 🦎💥

👋 Hey, we're Chris and Mandy, the creators of deeplizard!

👉 Check out the website for more learning material:

💻 ENROLL TO GET DOWNLOAD ACCESS TO CODE FILES

🧠 Support collective intelligence, join the deeplizard hivemind:

🧠 Use code DEEPLIZARD at checkout to receive 15% off your first Neurohacker order
👉 Use your receipt from Neurohacker to get a discount on deeplizard courses

👀 CHECK OUT OUR VLOG:

❤️🦎 Special thanks to the following polymaths of the deeplizard hivemind:
Tammy
Mano Prime
Ling Li

🚀 Boost collective intelligence by sharing this video on social media!

👀 Follow deeplizard:

🎓 Deep Learning with deeplizard:

🎓 Other Courses:

🛒 Check out products deeplizard recommends on Amazon:

🎵 deeplizard uses music by Kevin MacLeod

❤️ Please use the knowledge gained from deeplizard content for good, not evil.
Comments

Check out the corresponding blog and other resources for this video at:

deeplizard

Great job speeding Jensen Huang up. xD

debajyotisg

Beautifully done, Chris. Wow. Thanks. I learned a lot.

Sikuq

Wow, I saw the first 4 videos of the PyTorch series and am impressed by how much time & effort you put into these tutorials. Thanks a lot.
Also, you have developed enormously (although the older tutorials were already very good).

gabriellugmayr

This channel seriously deserves a million subs. I have been watching many series from this channel. Great work, keep going. I'm sure this channel is going to gain lots of subscribers someday.

majeedhussain

Amazing video and loved the short clips! Thank you!

engkamyabi

Congratulations, you've impressed me. Very professional series: right to the good stuff, with a clear and sharp voice and broad yet specific explanations.

Xiler

Rich, informative video!! No explanation is better than yours!!

Aweheid

5:13 Ballmer ambush, panic clicking to skip (thank you for the awesome video)

larryteslaspacexboringlawr

This channel should have more subscribers, seriously

ajwadakil

Thank you very much. I'm currently learning deep learning, and this was perfect for explaining why I need a good GPU.

MarcelloNesca

Nice video, I like all the graphics you used. Where do you find them?

Notrious

Good overview. Also, having 8 cores won't necessarily speed up computation by exactly 8x; perhaps 7x in practice.

I just wish you would mention that processors support SSE2, AVX2 and similar instruction sets, which let each core process several values at once (e.g., 8 floats with AVX2) rather than one by one, so the CPU's registers can work through arrays in chunks of 8. Many C/C++ programmers don't know about these intrinsics and build programs that are doomed to underperform by default.
So I feel everyone is always unfair towards the CPU. Everybody points at the cores, but each core can (and should) use intrinsics and do parallel work.
This matters especially with various RNNs, where we only win by moving the entire algorithm to the GPU to avoid data-transfer bottlenecks, and when the RNN is decently wide in each layer so the GPU stays busy.

Also, the CPU is really flexible when it comes to if/else or while loops, reacting quickly and nimbly when a branch occurs.

IgorAherne
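A rough timing sketch illustrating two of the points in the comment above, assuming PyTorch and a CUDA-capable GPU (the tensor sizes and the slice length are arbitrary assumptions, not benchmarks):

import time
import torch

x = torch.randn(1_000_000)

# 1) Element-by-element Python loop vs. one vectorized CPU call
#    (the vectorized path is free to use SIMD internally).
t0 = time.perf_counter()
s = 0.0
for v in x[:100_000]:                 # only a slice, or the loop takes far too long
    s += float(v) * 2.0
t1 = time.perf_counter()
y = x * 2.0                           # one call over the full 1M-element tensor
t2 = time.perf_counter()
print(f"loop over 100k: {t1 - t0:.4f}s   vectorized over 1M: {t2 - t1:.6f}s")

# 2) The GPU only wins if the data stays on the device; host<->device
#    transfers can eat the speedup for small or chatty workloads.
if torch.cuda.is_available():
    a = torch.randn(4096, 4096)
    b = torch.randn(4096, 4096)

    t0 = time.perf_counter()
    c_cpu = a @ b                         # matmul entirely on the CPU
    t1 = time.perf_counter()

    c_gpu = (a.cuda() @ b.cuda()).cpu()   # same matmul, timed with both transfers included
    t2 = time.perf_counter()
    print(f"CPU matmul: {t1 - t0:.3f}s   GPU matmul incl. transfers: {t2 - t1:.3f}s")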

Thanks for putting in all the effort.

mayur

I ended up here because my daughter is learning "AI" in high school, and now I need to understand how this all works to build her a PC.

estebansevilla

I'd like to know whether you can use a dedicated graphics card for deep learning while the CPU's integrated GPU drives the display.
This would help me a lot with my screen cable management issue (I'm new to this)!
Thanks!

jackt

So which graphics card should I buy for deep learning?

mexzarkashiev

Very professional video. Good information.

taj-ulislam

Any idea how the RTX graphics cards and their Tensor Cores compare to the standard GTX GPUs? Is that something that TensorFlow or PyTorch takes advantage of?

Ammothief
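Regarding the Tensor Core question above: RTX-era (Volta/Turing and newer) GPUs add Tensor Cores that accelerate half-precision matrix math, and PyTorch can use them through mixed precision. A minimal sketch assuming a recent PyTorch build (the shapes are arbitrary; dimensions that are multiples of 8 map better onto Tensor Cores):

import torch

if torch.cuda.is_available():
    a = torch.randn(4096, 4096, device='cuda')
    b = torch.randn(4096, 4096, device='cuda')

    # Inside this region, eligible ops such as matmul run in float16,
    # which is what Tensor Cores accelerate on RTX-class hardware.
    with torch.autocast(device_type='cuda', dtype=torch.float16):
        c = a @ b

    print(c.dtype)   # torch.float16 inside the autocast region

Cards without Tensor Cores can generally still execute this; they just don't get the Tensor Core speedup.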

Did they remove all those stats/functions in a newer version of Cudo? I just recently downloaded it, and the only things I can see on the screen are CPU, XMRig and h/s to the left, and Payout Coin to the right. That's it! I'm using the CPU but want to use the GPU, and I can't see any option. Please help if you can.

True