What is Gradient Descent?


Credits:

Comments

This is a great visual way to explain such an important algorithm. It can be used for so many different applications. Looking forward to more advanced versions with more animations. I would also love to learn more about different approaches to picking the starting point.

rryk
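For readers who want to experiment with the starting-point question, here is a minimal sketch of plain gradient descent in Python. This is my own illustration, not code from the video; the toy function f(x) = (x - 3)^2 and all hyperparameters are hand-picked assumptions.

```python
# Minimal gradient-descent sketch (illustrative only, not from the video).
# Minimizes f(x) = (x - 3)^2, whose gradient is f'(x) = 2 * (x - 3).

def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Follow the negative gradient from starting point x0."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)  # step downhill, scaled by the learning rate
    return x

minimum = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(minimum)  # approaches x = 3, the true minimum
```

On this convex toy function every starting point converges to the same minimum; the choice of `x0` only really matters for non-convex functions, which is what later comments in this thread discuss.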

wow i thought i was watching a 1m sub channel, amazing quality

dydx

Great Video. You will definitely grow your channel fast with this quality of video

jaceharrison

This was a great visual explanation, thank you. Can't wait to see your next video.

havocthehobbit

Brilliant animation and very good explanation. Keep up the great work!

AKfire

Great video, been subbing to a lot of smaller math channels

mrnarason

Great video. Even though I don't know much about this method (or neural networks in general), I happen to know that it might not always find the optimal solution; I believe there is even a term for this. It would be great if you could tell us more about it and how people try to solve this problem.

araqweyr

Yup, there are limitations to GD: it can hit saddle points or regions with vanishing gradients, where it can get stuck for a long time, or it can converge to a local minimum rather than the global minimum. That's why people created other variants like GD with momentum, NAG, RMSProp, Adam, and variations of Adam. Thanks for the video. I know all these concepts; I'm just looking for a 3D graph, since most YouTube videos use the simple 2D representation XD haha

giostechnologygiovannyv.ri
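To make the simplest of the variants named above concrete, here is a hedged sketch of gradient descent with momentum (heavy-ball style). The objective, learning rate, and momentum coefficient are my own illustrative choices, not anything shown in the video.

```python
# Gradient descent with momentum, an illustrative sketch.
# The velocity v accumulates past gradients, which helps the iterate
# coast through flat regions and small bumps instead of stalling.

def gd_momentum(grad, x0, lr=0.1, beta=0.9, steps=300):
    x, v = x0, 0.0
    for _ in range(steps):
        v = beta * v - lr * grad(x)  # blend old velocity with the new gradient
        x += v
    return x

# Toy objective f(x) = (x - 3)^2 with gradient 2 * (x - 3).
print(gd_momentum(lambda x: 2 * (x - 3), x0=0.0))
```

RMSProp and Adam build on the same idea but additionally rescale each step by a running estimate of the gradient's magnitude.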

One drawback of gradient descent is local minima: sometimes the steepest-descent path does not reach the actual minimum of the function.

jackieliu
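The local-minimum drawback is easy to reproduce. The sketch below is my own illustration with a hand-picked tilted double-well function: plain gradient descent ends up in a different basin depending on the starting point, and only one basin holds the global minimum.

```python
# Two runs of plain gradient descent on a non-convex function:
# f(x) = (x^2 - 1)^2 + 0.2*x has a global minimum near x = -1
# and a higher local minimum near x = +1.

def f(x):
    return (x * x - 1) ** 2 + 0.2 * x

def grad_f(x):
    return 4 * x * (x * x - 1) + 0.2

def descend(x0, lr=0.05, steps=500):
    x = x0
    for _ in range(steps):
        x -= lr * grad_f(x)
    return x

left = descend(-0.5)   # rolls into the global minimum near -1
right = descend(0.5)   # gets trapped in the local minimum near +1
print(left, right, f(left) < f(right))
```

Which basin you land in is decided entirely by `x0`, which is why the earlier comment about choosing the starting point matters for non-convex functions.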

How can you perform gradient descent on an unknown "black box" function?

ja
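One common answer to this question: if you can evaluate the black-box function but not differentiate it, you can estimate the gradient numerically with finite differences and feed that estimate to ordinary gradient descent. A one-dimensional sketch, with a lambda standing in for the opaque function (my own illustration):

```python
# Finite-difference gradient descent for a function we can only evaluate.

def numerical_grad(f, x, h=1e-5):
    """Central-difference estimate of f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

def black_box_descent(f, x0, lr=0.1, steps=200):
    x = x0
    for _ in range(steps):
        x -= lr * numerical_grad(f, x)
    return x

# The lambda stands in for a black box we cannot differentiate by hand.
print(black_box_descent(lambda x: (x - 2.0) ** 2, x0=10.0))
```

In n dimensions this costs about 2n function evaluations per step, so for expensive black boxes people often prefer stochastic gradient estimators or gradient-free methods instead.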

Also, it would be great if you shared your code too.

AKfire