How Neural Networks Work - Training - Part 4

In part 4 of the Neural Networks series, I walk through a more realistic example of training a neural network. This video builds upon the previous videos in the series, which cover weights, bias and activations, and sets up the next video, which will cover updating weights with back propagation.
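The forward pass the video describes (a weighted sum plus a bias, squashed through a sigmoid activation, through a 2-neuron hidden layer) can be sketched roughly as below. All of the input values, weights and biases here are made-up placeholders, not the numbers used in the video:

```python
import math

def sigmoid(z):
    # Squash the weighted sum into the range (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

def forward(inputs, weights, bias):
    # One neuron: weighted sum of the inputs plus a bias, then the activation.
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return sigmoid(z)

# Hypothetical 3-input network with a 2-neuron hidden layer and one output.
x = [0.5, 0.9, 0.2]
h1 = forward(x, [0.8, -0.4, 0.3], 1.0)    # hidden neuron 1
h2 = forward(x, [0.2, 0.6, -0.7], 1.0)    # hidden neuron 2
out = forward([h1, h2], [0.5, 0.1], 1.0)  # output neuron
print(round(out, 2))
```

Training then consists of comparing `out` against the desired target and nudging the weights to reduce the error, which is the back-propagation step promised for part 5.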
Comments

Where is part 5??? I really want to know how to adjust weights

tvgk

Nooo where's the next video?! This was so helpful

careyannehowlett

I apologize for the audio quality. You may have to turn up the volume on your speakers to hear properly.

JamesOliver

James, I don't see the part 5 back-propagation video

SriKanth-mzhr

Please upload a part 5 (updating weights with back propagation).

yassinekharrat

After three weeks of wondering, I've finally found the best explanation and now I understand. Thanks!

muhammadayoub

Where is the back propagation video, sir?

I really appreciate your tutorials; the explanations are very useful and saved me time.
Waiting for the back propagation video.

taimoorneutron

Man! Immensely helpful. Thank You James

shalindeval

Can someone please explain how the final value for A3 is 0.67? I'm getting 0.80

Praven

Great video!! But could you please explain more about how to choose the number of neurons in the hidden layer (2 in this case)? Thank you

TrungNguyen-oypy

I'm a bit confused: how did you get an activation value of 0.67 in the last neuron? I got 0.80 in my program, and I also rechecked the sigmoid function with a calculator. The question is whether I missed something. But thanks a lot for these videos; I learned the basics of neural networks and also made my first C++ program for training on data.
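For anyone comparing their own result against the video's 0.67: a quick sanity check is to recompute the final weighted sum z and push it through the sigmoid directly; a 0.67 vs 0.80 discrepancy usually means two different z values (e.g. a missed bias term or different upstream activations) rather than a wrong sigmoid. The two z values below are back-solved for illustration only; they are not taken from the video:

```python
import math

def sigmoid(z):
    # Standard logistic function.
    return 1.0 / (1.0 + math.exp(-z))

# Two hypothetical weighted sums that land on the two disputed activations.
print(round(sigmoid(0.708), 2))  # 0.67
print(round(sigmoid(1.386), 2))  # 0.80
```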

RudeRud

I still don't understand: if x1, x2 and x3 are all connected to both neurons in the hidden layer, why don't the weights of the red and blue connections converge to the same values? Wouldn't that happen with enough data? Although you started with random weights, if the function converges I'd expect w1 to be almost equal to w4, w2 ≈ w5 and w3 ≈ w6. Am I wrong?
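On the convergence question above, the standard answer is symmetry breaking: if the two hidden neurons started with identical weights they would receive identical gradient updates and stay identical forever, so random initialization is precisely what lets them learn different features, and w1 need not converge to w4. A minimal sketch with invented numbers (not the video's weights), using a sigmoid network and squared error:

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def hidden_gradients(x, target, w_hidden, w_out, bias=1.0):
    # Forward pass: two sigmoid hidden neurons, one sigmoid output.
    h = [sigmoid(sum(xi * wi for xi, wi in zip(x, w)) + bias) for w in w_hidden]
    out = sigmoid(sum(hi * wi for hi, wi in zip(h, w_out)) + bias)
    # Backward pass for the squared error 0.5 * (out - target)**2.
    d_out = (out - target) * out * (1 - out)
    # Gradient w.r.t. each hidden neuron's input weights.
    return [[d_out * w_out[j] * h[j] * (1 - h[j]) * xi for xi in x]
            for j in range(2)]

x, target = [0.5, 0.9, 0.2], 1.0

# Symmetric start: both hidden neurons share the same weights, so their
# gradients are identical and the neurons can never become different.
g_sym = hidden_gradients(x, target,
                         [[0.3, 0.3, 0.3], [0.3, 0.3, 0.3]], [0.5, 0.5])
print(g_sym[0] == g_sym[1])  # True

# Random start breaks the symmetry: the gradients differ from step one.
random.seed(0)
w_rand = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
g_rand = hidden_gradients(x, target, w_rand, [0.5, 0.1])
print(g_rand[0] == g_rand[1])  # False
```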

alegrego

James... one humble request: kindly make a video on data blending across two reports.

MrYeduguri

I am a medical student from Germany, writing my doctoral thesis in radiology on how we can use deep learning to recognise certain pathologies. Did you choose 1 as your bias in both steps just by chance, or should it be the same in every step? Can weights also be negative? Are those also chosen randomly? Thank you so much for your videos. I have watched probably 10 different videos about this topic, and yours are the only ones I didn't stop in the middle saying "wait, I don't get that!" Stay healthy!

neslihangroll

The video didn't really cover the training phase as stated in the title :( Kinda disappointing.

andrzej_k