Introduction to Numerical Optimization Gradient Descent - 1

Lecture 20
Comments

Actually, the gradient points in the direction of steepest ascent. We move in the opposite direction to minimize the cost function. That's why we put the negative sign before the learning rate in the parameter update step.

SoharabHossain
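
A minimal sketch of the update rule this comment describes, theta <- theta - learning_rate * gradient. The quadratic cost J(theta) = (theta - 3)^2, the starting point, and the learning rate below are illustrative assumptions, not values from the lecture:

def grad_J(theta):
    # Gradient of the assumed cost J(theta) = (theta - 3)**2
    return 2.0 * (theta - 3.0)

theta = 0.0          # initial guess (assumed)
learning_rate = 0.1  # step size, eta (assumed)

for step in range(100):
    # The negative sign: the gradient points toward steepest ascent,
    # so we step in the opposite direction to decrease J.
    theta = theta - learning_rate * grad_J(theta)

print(theta)  # approaches the minimizer theta = 3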

Great explanation of gradient descent. Watching this video was an Aha! moment for me. Thank you!

rehanhaque

Wow, your explanation is great, sir. I feel motivated to study harder.

fransiskusricardo

Dear Moderator, could you please post a link to the playlist when you post such beautiful videos? It's difficult to find a single playlist among swarms of playlists.

tvsrr

Is there a playlist for this? It's amazing.

anamikabhowmick

Numerical methods for the Harris Hawks Optimization (HHO) algorithm, please!

-fatimazahra