10.18: Neural Networks: Backpropagation Part 5 - The Nature of Code



Timestamps:
0:00 Introduction
0:14 Fix some mistakes
2:29 Add the deltas for the bias
3:54 Adjust the bias by its deltas
5:57 Stochastic gradient descent
7:26 Architecture for XOR problem
8:15 Training data
13:05 Randomize the training data
13:38 Start training
14:19 Outro
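The timestamps above cover the technical core of the episode: adding the deltas for the bias, adjusting the bias by those deltas, stochastic gradient descent, and training on the XOR problem. A minimal sketch of those ideas in plain JavaScript is below; the `TinyNN` class and all names are illustrative, not the video's actual `NeuralNetwork` library code, and the sketch uses plain arrays where the video uses a matrix class.

```javascript
const sigmoid = x => 1 / (1 + Math.exp(-x));
const dsigmoid = y => y * (1 - y); // derivative, given the sigmoid output

// Illustrative 2-4-1 fully connected network (not the video's exact code).
class TinyNN {
  constructor(nIn, nHidden, nOut) {
    const rand = () => Math.random() * 2 - 1;
    this.wIH = Array.from({length: nHidden}, () => Array.from({length: nIn}, rand));
    this.wHO = Array.from({length: nOut}, () => Array.from({length: nHidden}, rand));
    this.bH = Array.from({length: nHidden}, rand);
    this.bO = Array.from({length: nOut}, rand);
    this.lr = 0.5;
  }
  forward(inputs) {
    const hidden = this.wIH.map((row, i) =>
      sigmoid(row.reduce((s, w, j) => s + w * inputs[j], this.bH[i])));
    const outputs = this.wHO.map((row, i) =>
      sigmoid(row.reduce((s, w, j) => s + w * hidden[j], this.bO[i])));
    return {hidden, outputs};
  }
  train(inputs, targets) {
    const {hidden, outputs} = this.forward(inputs);
    // Output deltas: error times gradient of the activation.
    const dO = outputs.map((o, i) => (targets[i] - o) * dsigmoid(o));
    // Hidden errors: distribute the output error back through the weights.
    const dH = hidden.map((h, j) =>
      dsigmoid(h) * this.wHO.reduce((s, row, i) => s + row[j] * dO[i], 0));
    dO.forEach((d, i) => {
      this.bO[i] += this.lr * d; // the bias is adjusted by its delta alone
      hidden.forEach((h, j) => this.wHO[i][j] += this.lr * d * h);
    });
    dH.forEach((d, i) => {
      this.bH[i] += this.lr * d; // same for the hidden-layer bias
      inputs.forEach((x, j) => this.wIH[i][j] += this.lr * d * x);
    });
  }
}

// XOR training data, as in the episode.
const data = [
  {inputs: [0, 0], targets: [0]},
  {inputs: [0, 1], targets: [1]},
  {inputs: [1, 0], targets: [1]},
  {inputs: [1, 1], targets: [0]},
];

const nn = new TinyNN(2, 4, 1);
const mse = () => data.reduce((s, d) =>
  s + (d.targets[0] - nn.forward(d.inputs).outputs[0]) ** 2, 0) / data.length;

const before = mse();
for (let i = 0; i < 20000; i++) {
  // Stochastic gradient descent: one randomly picked sample per step.
  const d = data[Math.floor(Math.random() * data.length)];
  nn.train(d.inputs, d.targets);
}
console.log(mse() < before); // should print true: training reduces the error
```

Picking a random sample each step is the "randomize the training data" idea from 13:05; training on the four samples in a fixed order every epoch also works but can settle into cycles.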

Editing by Mathieu Blanchette
Animations by Jason Heglund
Music from Epidemic Sound

#neuralnetwork #backpropagation #xor #javascript
Comments

As this is the last (numbered) episode on the Neural Networks series, let me use this place to thank you for your excellent teaching skills. I realize that everyone has a different preferred way of learning, but your approach fits my learning 'flavor' perfectly. It's a very practical approach as you build everything from scratch, yet you give the mathematical context at a level that I can handle. I have been randomly browsing the internet for info on Machine Learning and Neural Networks (which admittedly helped in understanding your videos) but your series brought everything home for me.
Thank you!

ErikBongers

Wow. I am so excited, as if I coded it myself. Man, your energy is contagious 😁😁

rawbit

I just finished the whole series, and I want to tell you that you are amazing. I watched everything in 24 hours; once I started I couldn't stop watching. You are amazing!
I think it's the first time I've ever emotionally connected with a YouTube video.
Keep up the good work!

samuelkamhaji

Your NN series is just fantastic! I started watching it today and I couldn't stop until I actually got to see the end with a comprehensive understanding of the NN and the code. Thank you so much for sharing your knowledge with us!

vassisn

I am so glad I found you. I was struggling with Python to implement neural networks. Then I saw you build an ANN from scratch in JavaScript. Your videos motivated me to try the same in Excel VBA, which I'm familiar with.

I have finally done it in VBA. Many thanks for making such awesome videos and helping people like us!

mainakghosh

Okay, made it through the entire series. The code works, but I need to backpropagate a few times before the neural network inside my head has learned everything. Great series!

httn

Thank you so much!! This series was soooo useful while writing my NN trainer.
I'm currently working on an NNUE (Efficiently Updatable Neural Network) implementation for my Chess engine called Maxwell; the implementation of the network itself is trivially easy, but the training part was what I knew nothing about. This series was exactly what I needed!

eboatwright_

That was a crazy ride. I saw all of your live streams; you messed up a couple of times, which made me wonder what was going on, but then, thanks to you, some other videos explained those messed-up areas clearly. Now it works, and the fact that I actually understand what's going on under the hood makes me so excited and happy that I want to cry... You are the best teacher.

bijoyroy

I just finished my version - 14:00 was exactly my reaction as well hahaha. The code is incredibly ugly, but it feels so good to have made it this far. Gotta clean up the program tomorrow :D. Thanks for making this video series!

mumuhKUH

Your success made me cry with happiness, along with the pains you took to create these videos. I will now recommend your videos to other people!!

Ajay-zmngn

THANK YOU THANK YOU THANK YOU
I watched 3Blue1Brown's video on the topic but wasn't able to implement anything. I couldn't get past the matrices and the notation and whatnot.

But after this series I understood the reason behind the maths.
Finally my neural network is up and running.

adityajain

2:10 the shock of no errors; I can feel that happiness... 🤣🤣

imvickykumar

I am so happy you finally got it!!! This is a great series!

ioniniela

This series was great! I think working through a simple NN like this, doing all the matrix math by hand, is a very useful introduction before moving to something like TensorFlow. Thanks! Also, I can't wait for an updated NOC book in JS!

tyreldelaney

Thank you, thank you, thank you for letting me see each and every step of this journey!

I finally understand backpropagation thanks to your notion of "spreading the blame" for the error values. It makes perfect sense to me now.

I've implemented this in Tcl and it seems to be working. Soon I will do it in C for performance. My long-term goal is to use this mechanism to generate some music that (hopefully) no one has ever heard before.

Thank you, sir!

smokeyvw

This is an extremely insightful series that completely opens up the NN black box, including the hidden-errors part, since no one else really covers that bit. I would like to understand how the popular commercial APIs differ from this complete implementation in their core logic.

bgokhale

Wow! The feeling of having done this...simply awesome!

anuragghosh

You are an actual god, Jesus, I can't believe this series, it's perfect

santiagocalvo

This is by far your craziest coding challenge. It was a wild ride haha

FlareGunDebate

Thanks for the tutorial series; I've managed to write a C# neural network thanks to your examples and explanations.

killeriuxs