Conceptual Overview of Backpropagation Algorithm without Calculus

In this video we present the ideas behind the backpropagation algorithm used to train modern-day neural networks. We avoid calculus, so our presentation is mathematically informal. We build an intuitive notion of the derivative and then move on to the chain rule and the sum rule, which form the basis of the backprop algorithm.
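
Below is a minimal sketch, not taken from the video, of the chain-rule idea in code: nudge x, watch how the change flows through g and then f, and compare the result with the product of the two individual sensitivities. The functions f and g, the point x, and the step size eps are made up for illustration.

```python
# Chain rule without calculus notation: if y = f(g(x)), a small nudge to x
# changes g by roughly (sensitivity of g) * nudge, and that change in g
# changes y by (sensitivity of f) * that amount, so the sensitivities multiply.

def g(x):
    return 3.0 * x + 1.0   # inner function; its sensitivity to x is 3

def f(u):
    return u * u           # outer function; its sensitivity to u is 2*u

x = 2.0
eps = 1e-6                 # a tiny nudge

# Measure the sensitivity of the composite f(g(x)) directly by nudging x
numeric = (f(g(x + eps)) - f(g(x))) / eps

# Chain rule: multiply the two individual sensitivities
chain = 2.0 * g(x) * 3.0

print(numeric, chain)      # both are approximately 42.0
```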

Buy me a beer by clicking "Support" on my YouTube channel page!
Comments

This video is awesome! I wish more people would make videos like this. Keep up the good work!

franklinwang

Thank you. I might try it that way sometime.

ThatHippyPerson

This was very helpful. Thank you! Best explanation yet.

SageBetko

Thanks for the videos, they're brilliant.

jiashenglai

I wish I had watched this before all the other vids full of calculus, which I don't care about.

spb

Playing it at 0.75x will get him to talk at normal speed.

joelvaz

I'm still lost, but closer than most videos have gotten me. If I have a 3-input, 4-hidden, 1-output NN, how do I find how much I need to change w1? If anyone is watching this and understands, can you help? I just need it for the first weight; I can take it from there. H1 would equal (i1*w1)+(i2*w2)+(i3*w3), I get that, and then H1*W4 etc. How much do I change w1 and W4, the top two weights between my input and output? Error is (expected - result). Do I multiply this by the weight to get the amount of change, or what? I'm seriously confused.

seditt
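
A possible reply to the question above, sketched as code rather than formulas. This is an assumption-laden illustration, not the video's method: it assumes linear units (no activation function), a squared error of 0.5*(expected - out)^2, and a small learning rate. Under those assumptions the chain rule gives the sensitivity of the error to W4 as -(expected - out)*H1, and to w1 as -(expected - out)*W4*i1; each weight is then nudged by minus the learning rate times its own sensitivity.

```python
import numpy as np

# Hypothetical 3-input, 4-hidden, 1-output network with linear units and
# squared error, to show how the chain rule assigns an update to w1 and W4.
rng = np.random.default_rng(0)

i = np.array([0.5, -1.0, 2.0])   # inputs i1, i2, i3
W_ih = rng.normal(size=(4, 3))   # input -> hidden weights; W_ih[0, 0] plays the role of w1
W_ho = rng.normal(size=(1, 4))   # hidden -> output weights; W_ho[0, 0] plays the role of W4
expected = 1.0
lr = 0.01                        # learning rate (how big a step to take)

# Forward pass: H1..H4, then the single output
H = W_ih @ i                     # H[k] = sum over j of W_ih[k, j] * i[j]
out = (W_ho @ H)[0]

# With E = 0.5 * (expected - out)^2, a small increase in `out` changes E
# by -(expected - out); this is the "blame" assigned to the output.
delta_out = -(expected - out)

# Chain rule, output layer: sensitivity of E to W_ho[0, k] is delta_out * H[k]
grad_W_ho = delta_out * H

# Chain rule, hidden layer: sensitivity of E to W_ih[k, j] is
# delta_out * W_ho[0, k] * i[j]
grad_W_ih = np.outer(delta_out * W_ho[0], i)

# Gradient step: move every weight a little against its own sensitivity
W_ho = W_ho - lr * grad_W_ho
W_ih = W_ih - lr * grad_W_ih

print("change applied to w1:", -lr * grad_W_ih[0, 0])
print("change applied to W4:", -lr * grad_W_ho[0])
```

So, under these assumptions, the error alone is not simply multiplied by the weight: each weight's change also involves the signal that fed into it (H1 for W4, i1 for w1), and for the earlier layer it additionally involves the weight on the path from that hidden unit to the output.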

Did you increase the playback speed? The sound doesn't sound right; it makes me nauseous.

_bobbejaan

why do you speak so fast bro... thanks for the video btw

seyedalirezagolestaneh