Neural Networks Demystified [Part 3: Gradient Descent]

Neural Networks Demystified
@stephencwelch

Supporting Code:

Link to Yann's Talk:

In this short series, we will build and train a complete artificial neural network in Python. New videos every other Friday.

Part 1: Data + Architecture
Part 2: Forward Propagation
Part 3: Gradient Descent
Part 4: Backpropagation
Part 5: Numerical Gradient Checking
Part 6: Training
Part 7: Overfitting, Testing, and Regularization
Comments

Bruh, this is better than anything made on the internet since. You did a great, great job 9 years ago. It is so easy, simple, and beautifully put together. The playlist needs to make a comeback and go viral.

sameenfatima

Whenever I want to refresh my understanding of neural networks, I come back here and watch these videos all over again. Thanks so much.

ahmedas

By far the best explanation of neural networks I have heard so far. We need more teachers like you. :)

kelkarmhr

You deserve an award for the clarity, depth, and pace, all of which mean that with little effort any layman could understand what a neural network really is. Thank you, man!!!

KMG_Capital

Dude, you are simply awesome! I don't know how long I was looking for something like this - short, simple, and totally understandable. Kudos to your hard work.

im_tanmay_g

These are all absolutely amazing, Mr Welch. You have the gift of clarity. Thank you.

kevinmcinerney

Dear Stephen, I just started watching your videos about neural networks. They are just great! Please keep making more videos!

JoaoCavalcanti

I'm a total noob at neural networks; I came here just because I'm taking a deep learning course. This video is intuitive and awesome!

lmvejst

Omg, this is so beautifully understandable I'm going to cry. Saving my life for this neural computing exam...

sunilp

This was really lucid and simple to understand, quite unlike the mathematically dense explanations one is otherwise used to.

nilaymehta

Done, thanks.
5:50 Stochastic gradient descent vs. batch gradient descent: stochastic calculates the derivative of our cost function with respect to the weights for one input at a time, while batch calculates it for all inputs at once (see the sketch below).

mostinho
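
To make the distinction from the comment above concrete, here is a minimal sketch in NumPy, not taken from the series' supporting code: it contrasts a batch update (gradient of the cost over every example) with stochastic updates (gradient of one example at a time) on a hypothetical linear least-squares problem. The data, learning rate, and model are assumptions chosen purely for illustration.

    import numpy as np

    # Hypothetical toy problem (not the network from the series): linear least squares.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))            # 100 examples, 3 features
    true_w = np.array([1.5, -2.0, 0.5])
    y = X @ true_w + 0.1 * rng.normal(size=100)

    def gradient(w, X_part, y_part):
        # dJ/dw for the cost J(w) = mean((X_part @ w - y_part)**2) / 2
        return X_part.T @ (X_part @ w - y_part) / len(y_part)

    lr = 0.1
    w_batch = np.zeros(3)   # updated with batch gradient descent
    w_sgd = np.zeros(3)     # updated with stochastic gradient descent

    for epoch in range(50):
        # Batch: one update per pass, using the gradient over ALL examples.
        w_batch -= lr * gradient(w_batch, X, y)

        # Stochastic: one update PER example, visiting them in shuffled order.
        for i in rng.permutation(len(y)):
            w_sgd -= lr * gradient(w_sgd, X[i:i + 1], y[i:i + 1])

    print("batch GD estimate:     ", w_batch)
    print("stochastic GD estimate:", w_sgd)
    print("true weights:          ", true_w)

In practice, mini-batch gradient descent, which updates on small groups of examples, sits between these two extremes.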

This has been the clearest explanation of neural networks so far. Thank you.

AidanYi

Here in December 2021, preparing for my exams by watching these videos. Thanks!

ninjaasmoke

Wow, this series deserves vastly more views!

ZardicharSC

These videos are genius. They really helped me visualize what a neural network does behind the scenes. Thank you!!!

joshbrenneman

I've been taking a machine learning and neural networks course for about 2 months and had a lot of questions until I found your videos. Thank you so much!

ivanhernandezaguilar

OK, this is the single most awesome tutorial on neural networks!! Amazing job, +Welch Labs!!
This shows your passion for teaching and also how clear your concepts are. You are an inspiration to teachers and professors. I love your method of instruction! I sincerely request you to upload more videos on other machine learning algorithms. Thank you so much for these!

adamyatripathi

Really good and easy-to-understand demonstration.

yichen

I understood the topics far better than when my professor taught them! Kudos!

himanshu

Beautifully explained. We can imagine the hard work behind each video for such a neat presentation.

nullindullghooomin