Derivation of Backpropagation in Neural Network

Understand the maths behind backpropagation in neural networks. In this video, we will derive the backpropagation equations for a neural network.

In this video, we use binary classification. In the next video, we will cover backpropagation in neural networks for multi-class classification.

This video will be super helpful for implementing neural networks in Python; a small NumPy sketch follows the timestamps below.
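
For quick reference, the backward-pass equations for such a network are usually written as follows. This is a sketch under assumptions (a three-layer network with weights W1, W2, W3, sigmoid activations σ, binary cross-entropy loss averaged over m examples, and ⊙ for element-wise multiplication); the video's exact notation may differ slightly.

```latex
\begin{aligned}
dZ_3 &= A_3 - Y, & dW_3 &= \tfrac{1}{m}\, dZ_3\, A_2^{\top},\\
dZ_2 &= \left(W_3^{\top} dZ_3\right) \odot \sigma'(Z_2), & dW_2 &= \tfrac{1}{m}\, dZ_2\, A_1^{\top},\\
dZ_1 &= \left(W_2^{\top} dZ_2\right) \odot \sigma'(Z_1), & dW_1 &= \tfrac{1}{m}\, dZ_1\, X^{\top}.
\end{aligned}
```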

Timestamps:
0:00 - Video Agenda
1:30 - Important Shapes and Equations
4:25 - Derivative of Loss function w.r.t W3
12:00 - Derivative of Loss function w.r.t W2
16:08 - Derivative of Loss function w.r.t W1
18:33 - Summary
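
Since the description mentions implementing this in Python, here is a minimal NumPy sketch of the forward and backward pass summarized above. It is not the video's own code: the sigmoid hidden activations, the column-wise data layout (X of shape (n0, m)), and the made-up layer widths in the smoke test are assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X, W1, b1, W2, b2, W3, b3):
    # Forward pass; every activation is assumed to be sigmoid here.
    A1 = sigmoid(W1 @ X + b1)
    A2 = sigmoid(W2 @ A1 + b2)
    A3 = sigmoid(W3 @ A2 + b3)   # output probability in (0, 1)
    return A1, A2, A3

def backward(X, Y, A1, A2, A3, W2, W3):
    # Gradients of the averaged binary cross-entropy loss.
    m = X.shape[1]
    dZ3 = A3 - Y                          # sigmoid output + BCE simplifies to this
    dW3 = (dZ3 @ A2.T) / m
    dZ2 = (W3.T @ dZ3) * A2 * (1 - A2)    # element-wise: sigma'(Z2) = A2 (1 - A2)
    dW2 = (dZ2 @ A1.T) / m
    dZ1 = (W2.T @ dZ2) * A1 * (1 - A1)
    dW1 = (dZ1 @ X.T) / m
    return dW1, dW2, dW3

# Smoke test with hypothetical layer widths n0=4, n1=5, n2=3, n3=1 and m=10 examples.
rng = np.random.default_rng(0)
W1, b1 = 0.1 * rng.standard_normal((5, 4)), np.zeros((5, 1))
W2, b2 = 0.1 * rng.standard_normal((3, 5)), np.zeros((3, 1))
W3, b3 = 0.1 * rng.standard_normal((1, 3)), np.zeros((1, 1))
X = rng.standard_normal((4, 10))
Y = rng.integers(0, 2, (1, 10))
A1, A2, A3 = forward(X, W1, b1, W2, b2, W3, b3)
dW1, dW2, dW3 = backward(X, Y, A1, A2, A3, W2, W3)
print(dW1.shape, dW2.shape, dW3.shape)   # (5, 4) (3, 5) (1, 3), matching W1, W2, W3
```

Note how dW2 multiplies by A1.T and dW1 by X.T, which matches the correction raised in the comments below.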

➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖

This is Your Lane to Machine Learning ⭐

➖➖➖➖➖➖➖➖➖➖➖➖➖➖➖

Comments

If you found this video helpful, then hit the *_like_* button👍, and don't forget to *_subscribe_* ▶ to my channel as I upload a new Machine Learning Tutorial every week.

MachineLearningWithJay

I am mesmerized by the way you teach. I love this. I found learning NNs difficult, but after watching your videos and explanations, it seems easy to me. Thanks a lot. Please keep uploading your good work, no matter how many views your videos get. I believe your videos will certainly become popular once people notice them.

mdtufajjalhossain

Such a beautiful derivation, with simple and minimal use of notation!

ranjeetkumbhar

Thanks a lot for this amazing introductory lecture 😊
Lecture 5 completed from this neural network playlist.

PrithaMajumder

Excellent work here! I was having trouble coming from a single neuron in each layer, where the matrix multiplications were not matching. Thanks for being so clear on how to manage the transposes on the weight matrices! That was exactly the missing info I needed!

kingcrimson

The best worked out example I've found. Good stuff.

vicdata

Your videos are so good that I revisit them every month.

abhishekchaudhary

Thank you!! These are super informative.

vont

Loving this series. I could be mistaken, but I believe that in the summary slide, the equations for 'dW2' and 'dW1' are incorrect: 'dZ2' should be multiplied with 'A1T' and 'dZ1' should be multiplied with 'XT'. It didn't hinder at all, just wanted to point it out. Thanks so much again. 👍🏻👍🏻

kevinhoward
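
A quick dimension count supports this correction (assuming hypothetical layer widths n1, n2 and batch size m):

```latex
dZ_2 \in \mathbb{R}^{n_2 \times m},\quad A_1 \in \mathbb{R}^{n_1 \times m}
\;\Longrightarrow\; dZ_2\, A_1^{\top} \in \mathbb{R}^{n_2 \times n_1},
```

which is exactly the shape of W2; the analogous count gives dZ1 multiplied by XT for dW1.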

I love this video. Thank you so much for explaining such a complex concept so simply.

simpleman

Thank you so much for the explanation bro. This video means a lot to me. 🙏🙏

abhishekfnu

Great help. Thanks for all the videos. Too good!

garimakaushik

Very helpful, solved my little doubt.
Keep it up, bro!

aniketjalsakare

Thank you so much! Very helpful! Backpropagation was a hard nut to crack. NOT ANYMORE 😛

GK-jwbn

Great lecture. Clear and concise. Keep up the good work!

aloktrivedi

At 14:38 you say we can take the element-wise multiplication. I see the dimensions are the same, but how did you know we have to take an element-wise multiplication rather than a matrix multiplication, like we did previously for getting dW3? I am looking for an answer that does not just say we need to match the dimensions.

vaibhavoutat
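
A worked chain-rule step suggests why this product is element-wise (a sketch in the notation assumed above, not the video's wording): the activation is applied entry by entry, so each entry of Z2 affects only the matching entry of A2, and the chain rule never mixes entries:

```latex
A_2 = \sigma(Z_2)\ \text{entry-wise}
\;\Longrightarrow\;
\frac{\partial L}{\partial Z_2^{(i,j)}}
= \frac{\partial L}{\partial A_2^{(i,j)}}\,\sigma'\!\bigl(Z_2^{(i,j)}\bigr),
\qquad\text{i.e.}\quad dZ_2 = dA_2 \odot \sigma'(Z_2).
```

A genuine matrix product shows up in dW3 = dZ3 A2T instead, because each weight entry touches every training example, and the matrix product sums those per-example contributions.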

Thanks for the tutorial! Is there a theoretical justification for the part where you make up the correct dimensions?

christopherwashington

Excellent clips, thanks. I think there is an error in the last slide: dW2 and dW1 should depend on A1 and X respectively.

himtyagi

Could you explain why we take element-wise multiplication at 14:46 and not regular matrix multiplication?

siddharthc

Can you please explain, or share documentation for, taking the cost function as mean squared error and performing the derivation for regression?

sivaprasad-clxp
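
For the regression question above, one common variant (an assumption, not something shown in the video) keeps the output layer linear, A3 = Z3, and uses mean squared error; then only the first step of the derivation changes:

```latex
L = \frac{1}{2m}\,\lVert A_3 - Y \rVert_F^2,\quad A_3 = Z_3
\;\Longrightarrow\; dZ_3 = A_3 - Y,
```

with the 1/m factor folded into dW3 as before, so dW3, dZ2, dW2, dZ1, and dW1 all keep the same form.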