Gradient Descent Explained


Gradient descent is an optimization algorithm commonly used to train machine learning models and neural networks. These models learn from training data over time, and the cost function within gradient descent acts as a barometer, gauging the model's accuracy with each iteration of parameter updates. IBM Master Inventor Martin Keen explains.
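The update loop described above can be sketched in a few lines of Python. This is a minimal illustration, not code from the video: the quadratic cost J(w) = (w - 3)² and its learning rate are hypothetical choices made for the example.

```python
# Minimal gradient descent sketch on a one-parameter cost function.
# Cost: J(w) = (w - 3)^2, with gradient dJ/dw = 2 * (w - 3).
# The true minimum is at w = 3.

def gradient(w):
    """Gradient of the example cost J(w) = (w - 3)^2."""
    return 2.0 * (w - 3.0)

def gradient_descent(w0, learning_rate=0.1, steps=100):
    """Repeatedly step against the gradient to reduce the cost."""
    w = w0
    for _ in range(steps):
        w -= learning_rate * gradient(w)  # parameter update
    return w

w_opt = gradient_descent(w0=0.0)
print(w_opt)  # converges toward 3.0
```

Each iteration moves the parameter a small step in the direction that decreases the cost; the learning rate controls the step size, trading off speed of convergence against the risk of overshooting the minimum.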

#AI #Software #ITModernization #GradientDescent #lightboard #IBM #DataFabric #WatsonX
Comments

Very nice explanation of the concept, brief and understandable. Awesome!

Msyoutube

The most confusing part of this video is how he managed to write everything backwards on the glass so flawlessly

davidrempel

Good explanation. It is also worth noting that the curve should be differentiable.

krishnakeshav

Thank You Martin, really helpful for my uni exam

Shrimant-ubul

I'm always confused by these screens or boards, whatever.
Like how do you write on them? Do you have to write backwards, or do you write normally and it kinda mirrors it?

harshsonar

didn't know Steve Kerr works at IBM

sotirismoschos

Very good explanation of the high-level concept of GD.

hugaexplit

The best video I could find. Thank you.

krissatish

Nice, I learned more from this 7 min video than a 1 hour long boring lecture

_alekss

ibm: "how to make a neural network for the stock market?"

velo

I was expecting a mathematical explanation :(

Rajivrocks-Ltd.

I couldn't visualise, I saw nothing on the screen...

abdulhamidabdullahimagama