Gradient descent, how neural networks learn | Chapter 2, Deep learning

Enjoy these videos? Consider sharing one or two.

This video was supported by Amplify Partners.

To learn more, I highly recommend the book by Michael Nielsen
The book walks through the code behind the example in these videos, which you can find here:

MNIST database:

Also check out Chris Olah's blog:
His post on neural networks and topology is particularly beautiful, but honestly all of the stuff there is great.

And if you like that, you'll *love* the publications at Distill:

For more videos, Welch Labs also has some great series on machine learning:

"But I've already voraciously consumed Nielsen's, Olah's and Welch's works", I hear you say. Well well, look at you then. That being the case, I might recommend that you continue on with the book "Deep Learning" by Goodfellow, Bengio, and Courville.

Music by Vincent Rubinetti:

Thanks to these viewers for their contributions to translations
Hebrew: Omer Tuchfeld
Italian: @teobucci

-------------------
Video timeline
0:00 - Introduction
0:30 - Recap
1:49 - Using training data
3:01 - Cost functions
6:55 - Gradient descent
11:18 - More on gradient vectors
12:19 - Gradient descent recap
13:01 - Analyzing the network
16:37 - Learning more
17:38 - Lisha Li interview
19:58 - Closing thoughts
------------------
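For anyone who wants to poke at the ideas from the timeline above (cost functions at 3:01, gradient descent at 6:55), here is a minimal, hypothetical sketch of gradient descent on a toy two-parameter cost surface. The function and learning rate are invented for illustration; a real network's cost averages its errors over all training examples, and the gradient comes from backpropagation, which is the subject of Chapter 3:

```python
import numpy as np

def cost(w):
    # Toy quadratic cost surface with its minimum at w = (3, -1).
    # The video's real cost instead averages squared output errors
    # over all 13,002 weights and biases of the MNIST network.
    return (w[0] - 3.0) ** 2 + (w[1] + 1.0) ** 2

def grad(w):
    # Gradient of the toy cost, computed by hand here; a real
    # network obtains this vector via backpropagation.
    return np.array([2.0 * (w[0] - 3.0), 2.0 * (w[1] + 1.0)])

w = np.array([0.0, 0.0])   # initial parameters (random in practice)
lr = 0.1                   # learning rate: size of each downhill step
for _ in range(100):
    w -= lr * grad(w)      # step along the negative gradient
```

After enough steps the parameters settle near the minimum at (3, -1); the same "nudge every parameter downhill" update, applied to thousands of weights and biases at once, is what the video visualizes.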

3blue1brown is a channel about animating math, in all senses of the word animate. And you know the drill with YouTube, if you want to stay posted on new videos, subscribe, and click the bell to receive notifications (if you're into that).

Various social media stuffs:
Comments

Part 3 will be on backpropagation. I had originally planned to include it here, but the more I wanted to dig into a proper walk-through for what it's really doing, the more deserving it became of its own video. Stay tuned!

bluebrown

Disappointed you did not animate a 13,000-dimensional graph. Would make things easier

melkerper

I'm an IT student, and we have an assignment on exactly this topic. We even have to use the MNIST data set. I have to say, this is absolutely lifesaving and I cannot thank you enough, Grant. What you do here is something that only a handful of people on this planet can do: explain and visualize rather complicated topics beautifully and simply. So from me and A LOT of students all around the globe, thank you so so much <3.

DubstepCherry

3:38 you missed the chance of using the meme *"AI: I've found an output, but at what cost?"*

JockyJazz

Unlike most teachers of subjects like this, this gentleman seems to be genuinely concerned that his audience understands him, and he makes a concerted and highly successful effort to convey the ideas in a cogent, digestible and stimulating form.

plekkchand

I'm only 12 minutes into this video right now, but I just wanted to say how much I appreciate the time and spacing you give to explaining a concept. You add pauses, you repeat things with slightly different wording, and you give examples and zoom in and out, linking to relevant thought processes that might help trigger an "a-ha" moment in the viewer. Many of these "hooks" actually make me understand concepts I've had trouble grasping in Maths, all because of your videos and the way you choose to explain things. So thanks! You're helping me a lot to become a smarter person. :)

Shrooblord

One of YouTube's highest-quality content channels! Chapeau

snookerbg

Not only are the videos on this channel great, but the lists of supporting materials are amazing too! They drive me down a breathtaking rabbit hole every time! Thank you!

colonelmustard

I just sat through a 3-day ML accelerator class, and your series did a far better job of explaining these concepts with four twenty-minute videos. Well done, mate. Really appreciate it. Thank you

kraneclaims

"But we can do better! Growth mindset!" at 5:18... a wholesome intellectual. I love to see it

hangilkim

I think what sets this material apart from the competition is the author's intuition for the focal points where the audience might lose the plot. He then takes a patient and systematic turn to reiterate what has been learned so far, reinforcing the basics to decrease the cognitive leap needed to grasp the next step. In my experience this ability is pretty unique.

dsmogor

I have no words to describe how thankful I am. Thank you so much for such great content.

paulo

My math career is over. Once I learned about gradient descent, it was all downhill from there.

seCkielrd

You are changing the world and shaping humanity. I wish you and your team a happy and peaceful life. This is a noble profession; God bless you guys.

bikkikumarsha

Can't believe you explained this so easily. I thought it would take me ages to wrap my head around what neural networks basically are. This is a truly amazing explanation!

imad_uddin

Rarely have I seen such well-explained videos that break a complex topic down into components that are so easy to understand. Perfect pacing and focus on the core aspects, plus summaries and references to previously discussed ideas. Very good examples and animations, which meet viewers with little or no prior knowledge right where they are. THANK YOU

TheJeSuzZ

After watching your first video, I ended up drawing up a "mock" neural network on paper that would work on a 3x3 grid (after all, what else are you supposed to do during a boring lecture class?). It was supposed to recognize boxes, x's, sevens, and other simple shapes, and I defined the 7 or so neurons that I thought it might need by hand. I did all the weighted sums and sigmoid functions on paper with a calculator in hand. It took maybe an hour and a half to get everything straight, but once I did, it worked. It guessed with fairly good accuracy that the little seven I "inputted" was a little seven. All that excitement because of your video.

Later that evening and the next one, I tried to program the same function taking PNGs as inputs and definitions of the neurons and it honestly was only a little more rewarding. But now that I see what the hidden neurons *actually* look like, I only want to learn so much more. I expected the patterns to be messy, but I was really surprised to see that it really does almost look like just noise.

Thank you for making these videos. I find myself suddenly motivated to go back to calculus class tomorrow and continue our lesson on gradients. There's just so much out there to learn, and it's educators like you who are making it easier for curious individuals like me to get there.

Nyhilo
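The pencil-and-paper computation Nyhilo describes, a weighted sum of pixel values squished through a sigmoid, fits in a few lines. The 3x3 pixel grid, the weights, and the bias below are invented for illustration (a neuron that roughly fires on the lit top stroke of a "7"):

```python
import math

def neuron(inputs, weights, bias):
    # Weighted sum of the inputs plus a bias, squished into (0, 1)
    # by the sigmoid: the same calculation done by hand above.
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-z))

# A 3x3 "image" flattened to 9 pixels (1 = lit, 0 = dark),
# drawn here as a tiny seven.
pixels = [1, 1, 1,
          0, 0, 1,
          0, 1, 0]

# Hand-picked weights rewarding a lit top row and penalizing
# everything else; the bias sets the firing threshold.
top_row_weights = [ 2,  2,  2,
                   -1, -1, -1,
                   -1, -1, -1]
activation = neuron(pixels, top_row_weights, bias=-3.0)
```

With this input the weighted sum comes out positive, so the activation lands above 0.5, i.e. the neuron "fires". Chaining layers of such neurons, and letting gradient descent pick the weights instead of a human, is exactly the jump the video makes.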

That end comment with Lisha Li really points out how important it is to put a lot of effort into gathering and creating good, structured data sets. I know it's cliché to say "garbage in, garbage out", but these findings give very precise context and weight to this particular issue.

Skydmig

Man, it feels so good to learn all of this in one go now: neural networks, gradient descent, backpropagation. I used to get frustrated with a lot of these challenging concepts because I did not know the maths or the AI terms, but now, after learning for a year, it feels worth it. Thanks to the 3Blue1Brown guy; whatever course he touches is worth all other lectures combined. It's just pure core concepts with animation. Quality on par with the best.

WiredWizardsRealm-etpp

Math courses at my college are basically trash compared to your videos; finally I understand how math is applied in computer science. Thank you so much for teaching in such an illustrative way.

darshita