What is backpropagation really doing? | Chapter 3, Deep learning

What's actually happening to a neural network as it learns?
An equally valuable form of support is to simply share some of the videos.

The following video is sort of an appendix to this one. The main goal with the follow-on video is to show the connection between the visual walkthrough here, and the representation of these "nudges" in terms of partial derivatives that you will find when reading about backpropagation in other resources, like Michael Nielsen's book or Chris Olah's blog.
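To make that connection concrete, here is a minimal, hypothetical one-weight example (the tiny "network", the values, and the function names are invented for illustration, not taken from the video): the chain rule gives the "nudge" direction for the weight, and a numerical finite-difference check agrees with it.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def cost(w, a=0.6, b=0.1, y=1.0):
    """Cost of a one-weight 'network': C = (sigmoid(w*a + b) - y)^2."""
    return (sigmoid(w * a + b) - y) ** 2

def dcost_dw(w, a=0.6, b=0.1, y=1.0):
    """Chain rule: dC/dw = 2*(out - y) * out*(1 - out) * a."""
    out = sigmoid(w * a + b)
    return 2 * (out - y) * out * (1 - out) * a

w = 0.5
# Central finite difference as a sanity check on the analytic derivative.
numeric = (cost(w + 1e-6) - cost(w - 1e-6)) / 2e-6
analytic = dcost_dw(w)
```

The sign of `dcost_dw(w)` is exactly the "nudge": here it is negative, so increasing `w` decreases the cost, which is the direction gradient descent would push.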

Video timeline:
0:00 - Introduction
0:23 - Recap
3:07 - Intuitive walkthrough example
9:33 - Stochastic gradient descent
12:28 - Final words

Thanks to these viewers for their contributions to translations:
Italian: @teobucci
Vietnamese: CyanGuy111
Comments

You might think that your videos are fodder for university students boning up on a subject, or mathematicians/engineers in the early stages of their careers - basically that you cater to a younger audience. I'll have to prove you wrong. I'm in my early 60s and have been involved with information technology in some form or fashion my entire career. I enjoy learning; always have. I've viewed many of your videos only because they interest me, and have subscribed to you on my YouTube account so as to get notifications of updates. I find the topics about which you speak fascinating and am a bit jealous of those university grads today who now have access to this material at their fingertips. I wish with all my heart that I had been able to access these videos back when I was in university. It would have made life SOOO much easier for me back then. Your pedagogic skills are astounding, demonstrated by your ability to communicate difficult subjects precisely, concisely and simply. The animation format is integral to the presentation, adding to the delivery of the material. I salute you!! Please keep these videos coming.

rbm

I can't claim to have understood everything from the first watch-through of this series, and I will watch these videos again with pen and paper in hand, but even this first viewing has made neural networks go from pure witchcraft and wizardry to something that actually makes sense in my head.

I can't possibly thank you enough for posting these videos.

Niki_

This series is totally brilliant. I am 73 years old and used to teach mathematics. I am still learning stuff, and with the help of sites like yours it is so much easier. Have you thought of doing any videos on the really complex subject of real analysis? Keep up the good work. Kevin Connolly

kevinconnolly

Anyone else smiling through all of his videos because you're understanding so much so well like never before? <3

ThePRASANTHof

These visualizations are spot-on. Only a few people in the entire world need to make great explanations backed by powerful visualizations of a topic - the rest of the world just needs to discover them. So much time is wasted by learners trying to locate easily digestible information among all the inferior presentations out there. Glad to have found one of the best for this topic.

TopGunMan

Great explanation, your team is awesome. "A drunk man stumbling aimlessly downhill, but taking quick steps" is the best analogy ever for stochastic gradient descent. :-)
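The analogy can be sketched in a few lines (the function, data, and parameter values below are made up for illustration): each step estimates the downhill direction from a small random mini-batch, so individual steps are noisy like the stumbling drunk, but each one is cheap, so you can take many of them quickly.

```python
import random

def sgd(samples, lr=0.1, steps=300, batch_size=2, seed=0):
    """Minimize mean((w - x)^2) over the samples with stochastic gradient descent.

    Each step estimates the gradient from a random mini-batch, so the
    trajectory is noisy (the drunk stumbling) but each step is cheap
    (the quick steps)."""
    rng = random.Random(seed)
    w = 0.0
    for _ in range(steps):
        batch = rng.sample(samples, batch_size)
        # d/dw of mean((w - x)^2) = 2 * mean(w - x), estimated on the batch only
        grad = 2 * sum(w - x for x in batch) / len(batch)
        w -= lr * grad
    return w

data = [1.0, 2.0, 3.0, 4.0]
# The true minimizer is the sample mean, 2.5; SGD stumbles into its neighborhood.
w = sgd(data)
```

With `batch_size` equal to the full dataset this would be plain gradient descent; shrinking the batch trades exactness of each step for speed, which is the whole point of the analogy.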

srirams

I find it astonishing how well you convey some of the intricacies here, way better than most of ML practitioners who teach the public, whether as youtubers, online instructors or public speakers who end up on video on the internet. I very much resonate with the way you frame things, the metaphors you choose, your visualizations and of course your evident love for understanding and sharing thereof. Your work is a great gift to all of us - students, engineers, researchers, philosophers, random viewers from all walks of life. Thank you.

dinub

I must say it: I am 100% serious, I learnt English just to be able to watch your videos. This is the kind of content that will help everyone who is wandering grow stronger in their favourite subjects. I thank you for your work from the bottom of my heart ❤️❤️

kemsekov

First time I caught myself having a moment of awe while watching educational content. The production value is incredibly high... the way the connections twinkle and move to represent adjusting the weights, the small animations, the descent into various shapes, how the little arrows indicating the desired change move and change size. Beautifully put together! Thanks a lot!

MySkittlesRainbow

I've watched these videos 3 times, and every time I watch them, I feel a bit smarter. It starts with understanding little, to progressively understanding more and more, and finally seeing the big picture. Can I say I've understood everything? No. I am getting there and I'll be coming back to these again. For anyone who's feeling discouraged, I can assure you that you'll get there.
Thank you so much for creating quality content that brings the driest, most theoretical concepts to life! You're a hero.

AishwaryaAR

No form of words can express enough what magic you're creating! I don't know how much you actually think you impact us.. but let me tell you Grant, your effect on my life is immeasurable! And the fact that you learnt it the hard way, and made it so simple for us, so that we don't have to go through the same, makes me respect you even more and more every single day.. Thank you so much.. 3B1B is undoubtedly the best channel on YouTube..

shiladitya

I just want to say that I love that you're moving into the mathematics of ML. The visualizations convey the concepts so well!

makebreakrepeat

Note to myself:
Aditya, if you're having trouble understanding, read this.

Scroll to 06:17. Listen to what he's saying, "In a sense, the neurons that are firing while seeing a 2, get more strongly linked to those firing when thinking about a two." Now, pause the video and listen.

All this is just a fancy way of saying: when we show our model a picture of a handwritten 2 and tell it, "hey, listen up, this thing is a 2" (in our training set we have labels for our input pictures), we then find the activation units that have a say in influencing the hypothesis value of the 2 label, i.e. the units that can heavily increase or decrease the value of the output unit for the label 2. We tell those activation units, "hey guys, when you see something that resembles this thing, fire up the 2 label," i.e. increase the hypothesis value of the label for 2. Basically, we assign those activation units greater weights (or parameters), so that they influence the hypothesis value of the 2 label more and can actually give us the right answer: "oh look, this is probably a two."

I hope this helped and didn't complicate it further. If you don't get it, go over the video a few more times and review Andrew Ng's notes on this in the ML course on the Backpropagation lecture in week 5. Cheers, bro. Love you.
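The note above can be sketched as a toy single-output layer (all names, activations, and the learning rate are hypothetical, not the actual network from the video): for an output o = Σ wᵢ·aᵢ, we have ∂o/∂wᵢ = aᵢ, so pushing o upward means strengthening each weight in proportion to the activation it multiplies - the most active inputs get linked most strongly to the "2" output.

```python
def nudge_weights(activations, weights, target_up=True, lr=0.5):
    """Nudge each weight in proportion to the activation it multiplies.

    For a linear output o = sum(w_i * a_i), do/dw_i = a_i, so pushing o
    up adds lr * a_i to w_i: the inputs firing most get strengthened most
    ("neurons that fire together, wire together")."""
    sign = 1.0 if target_up else -1.0
    return [w + sign * lr * a for w, a in zip(weights, activations)]

acts = [0.9, 0.1, 0.0]   # hypothetical activations when the input looks like a 2
ws = [0.0, 0.0, 0.0]
new_ws = nudge_weights(acts, ws)
# The weight tied to the most active input grows the most.
```

Passing `target_up=False` gives the mirror-image nudge: weights into a wrong output (say, the "8" label firing on a 2) get weakened in the same activation-proportional way.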

adityashukzy

Let's just say a quick thank you to all the mathematicians who wrote the highly optimized software libraries that do all this math for us so ordinary programmers can quickly get neural nets up and running on our training data.

fakecubed

Man, the clarity and the animations make your videos masterpieces

mark.fedorov

All of the videos shown on this channel have been written so well. Even I can understand them, and mathematics was a complete mystery for me in school. I have a sixth-grade education and I feel really smart after watching one of these programs. Thanks so much. I'm really grateful that you have taken the time to educate the people who have a hard time understanding but still want to learn. What you're doing is just as important as any other volunteer or charity work. I'm excited I found this.

donniegoodman

I have just found your videos. I have enjoyed watching science educators on youtube for many years, but these videos are the best examples of complex ideas being explained clearly whilst still being entertaining. I can't believe I had never heard of this channel before.

SamuelJFord

Thank you for what you are doing. I have just started learning ML and I find your videos really helpful. Thank you again from the bottom of my heart for everything you are doing. I just wanted to let you know that you are really making a difference and inspiring youngsters like me.

bikrammajhi

You are a wizard when it comes to animations and understanding

cameronadams

I've been reading the book for quite some time but your explanations using animations have pushed my understanding to new levels. So many thanks to you.

PS: Keep the background music. Holds our concentration for a long time.

gmish