[Neural Network 7] Backpropagation Demystified: A Step-by-Step Guide to the Heart of Neural Networks

Erratum
3/5/2024
At 14:53, ∂C/∂w5 = -0.1 (not -0.01), so the new w5 is 0.56.
Also, at 19:20, it should be z3 = 0.6156 × 0.56 + 0.5819 × 0.4595 = 0.6121, and o1 after the sigmoid calculation is 0.64842. I apologize for the basic mistake!
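A quick numerical check of the corrected values in Python (variable names follow the video's notation; 0.6156 and 0.5819 are the hidden-layer activations used above):

```python
import math

# Corrected forward pass through the output neuron, per the erratum above.
h1, h2 = 0.6156, 0.5819       # hidden-layer activations
w5, w6 = 0.56, 0.4595         # weights, with w5 corrected to 0.56

z3 = h1 * w5 + h2 * w6        # weighted input to the output neuron
o1 = 1 / (1 + math.exp(-z3))  # sigmoid activation

print(round(z3, 4))   # 0.6121
print(round(o1, 5))   # 0.64842
```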
-----------------------------------------------------------------------
Welcome to EZ LearnAI! Today, we're unraveling the mysteries of neural networks with a hands-on guide to the Backpropagation Algorithm. We'll break down this cornerstone of AI learning through a clear, numerical example. Whether you're a beginner or a seasoned learner, get ready for an insightful journey into the world of artificial intelligence. Let's dive into backpropagation together!

#Backpropagation #NeuralNetworks #MachineLearning #AIExplained #DeepLearning #AIeducation #TechTutorial #DataScience #ArtificialIntelligence #LearningAlgorithms

🎵 Song: 'AERØHEAD - Fragments' is under a Free for YouTube license.
🎶 Music promoted by BreakingCopyright:
Comments

Oh my God! I can't believe this simple video actually helped me understand the math behind backpropagation! I spent a week trying to understand the math, the how and why the output layer is actually related to the hidden layers in terms of derivatives, and went crazy with ChatGPT and a whole bunch of other overly complex videos that just missed the point for beginners. Thank you so much!

tonyh

The only video I found that explains the sequence of weight updates clearly 😇

alannesta

Wow! I usually give a thumbs down to AI-generated voiceovers, but this was exceptional, thank you... The chain rule explained with animal speeds was exceptionally creative... THUMBS UP!

veganath

Thank you for this. I know calculus, and this is the only video I've found that truly breaks the various derivatives apart and explains where each component of the dC/dw value comes from, and also how to use that value once calculated. The only thing I still want after this video is how to adjust the biases when doing backpropagation.

zenmikey
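On the bias question raised above: a bias is updated exactly like a weight, using dC/db in place of dC/dw; since z = w·h + b, the local derivative dz/db is 1. A minimal sketch for one sigmoid neuron with half-squared-error cost (the numbers and learning rate here are illustrative assumptions, not the video's exact example):

```python
import math

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

# One sigmoid neuron: z = w*h + b, cost C = (o - y)**2 / 2.
h, w, b, y = 0.6, 0.5, 0.1, 1.0  # input activation, weight, bias, target (made up)
lr = 0.5                         # learning rate (assumed)

o = sigmoid(w * h + b)
delta = (o - y) * o * (1 - o)    # dC/dz via the chain rule

w -= lr * delta * h  # dC/dw = delta * dz/dw = delta * h
b -= lr * delta      # dC/db = delta * dz/db = delta * 1
```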

Slow and steady explanation at 10 mins 👏

shreenivasn

Superb explanation! I can't find any other video that makes backpropagation this easy to understand. Hats off to EZlearn AI 😎

sushantgarudkar

One of the best intuitive and simple explanations on YouTube right now.

Jack-cmch

Superbly done! One of the best tutorials I've seen, thank you.

SassePhoto

Absolutely brilliant and extremely well done, thanks a million. It seems you fully understand it, and it shows in your voice.

PaulBrassington_flutter_expert

Thank you for your videos. They really help with understanding the topics of Machine Learning and AI. This explanation of backpropagation was the only one that gave me a good visual, step-by-step guide to the mathematics behind it. Really helpful for my test next week <3

pdyvkqk

Thank you so much, I finally understand backpropagation, but I want to ask: how do we optimize the values if there are biases?

cqziemh

Thank you for breaking it down weight update by weight update!

seaofgeese

Thanks, this is a great description of the backpropagation process.

rogerzimmerman

Hi, thanks for the simple and detailed explanation. One doubt: while calculating the weight update for w1, if we include the additional component with respect to the other path containing w6 and w2, would that be wrong?

masoodamodak
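Regarding the multi-path doubt above: in general, when a weight influences the cost through more than one downstream path, the chain rule sums the contributions of all paths. Whether that applies to w1 depends on the video's network topology, but here is a hedged sketch for a hidden activation h that feeds two output neurons (all names and numbers are illustrative assumptions):

```python
# If h feeds two outputs o1 and o2, a weight w1 upstream of h affects the cost
# through both paths, and the chain rule sums them:
#   dC/dw1 = (dC/do1 * do1/dh + dC/do2 * do2/dh) * dh/dw1
dC_do1, do1_dh = 0.2, 0.3  # path through the first output neuron (made up)
dC_do2, do2_dh = 0.1, 0.4  # path through the second output neuron (made up)
dh_dw1 = 0.5               # local derivative of h with respect to w1 (made up)

dC_dw1 = (dC_do1 * do1_dh + dC_do2 * do2_dh) * dh_dw1
print(dC_dw1)  # 0.05
```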

To update one parameter at 2:06, why multiply by 60000?

jairam

At 9:07, aren't w5 and w6 the weights of the third layer?

rede_neural

Please don't stop making these videos.

bhushan

If we had more than one training example, then in the forward pass we would compute the output-layer activation for both examples and find the cost.

In that case we would have two values for every variable (z and its activation in each layer). Say we are finding dC/dw5 as at 8:01, where dC/dw5 involves h1, i.e., the activation of the first neuron in the first hidden layer.

Now we would have two values for h1 (one for each training example in the forward pass). Which of the two values should we use for finding the gradient of C with respect to w5: the h1 computed with the first training example, or the h1 computed with the second? Or should we compute the average of both?

Kindly answer this question; nowhere on the internet can I find the answer. Everyone just gives the big picture of gradient descent, etc. I want to know exactly how it's done for a simple example like this one before building neural networks.

the_random_noob
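On the batch question above: with batch gradient descent the cost is the average of the per-example costs, so the gradient is the average of the per-example gradients. You compute dC/dw5 with each example's own forward-pass h1, then average. A minimal sketch (all values are made up for illustration):

```python
# Two training examples: compute dC/dw5 with each example's own forward-pass
# values, then average, since C = (C_1 + C_2) / 2 implies dC/dw5 = mean of dC_i/dw5.
grads = []
for h1, o1, y in [(0.60, 0.70, 1.0),   # example 1: hidden activation, output, target
                  (0.55, 0.40, 0.0)]:  # example 2 (made-up forward-pass values)
    delta = (o1 - y) * o1 * (1 - o1)   # output error term for a sigmoid unit
    grads.append(delta * h1)           # this example's dC_i/dw5

dC_dw5 = sum(grads) / len(grads)  # average over the batch
w5 = 0.61 - 0.5 * dC_dw5          # one gradient step (w5 = 0.61, lr = 0.5 assumed)
```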

Can you reverse the ordering of this playlist? It should be the opposite of what it is right now.

tffksdk