Ada Grad and Ada Delta Optimizer || Lesson 14 || Deep Learning || Learning Monkey ||

#deeplearning#neuralnetwork#learningmonkey

In this class, we discuss the AdaGrad and AdaDelta optimizers.
We have seen that a momentum term helps move the weights quickly towards the minimum point.
As we approach the minimum point, however, we have to slow down.
For this we need a learning rate (alpha) that is controlled automatically: we decrease the alpha value as the weights move towards the minimum point.
AdaGrad achieves this by dividing alpha by the square root of the accumulated sum of squared gradients, so the effective step size shrinks as training progresses.
This helps the weights settle close to the minimum point.
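The idea above can be sketched in a few lines. This is a minimal illustration, not the lesson's exact code: it minimizes the toy function f(x) = x^2 (gradient 2x), and the function name and hyperparameter values are chosen only for the example.

```python
import math

def adagrad_minimize(alpha=1.0, eps=1e-8, steps=100):
    """Minimize f(x) = x^2 with AdaGrad, starting from x = 2."""
    x = 2.0
    grad_sq_sum = 0.0  # accumulated sum of squared gradients
    for _ in range(steps):
        g = 2.0 * x                     # gradient of x^2
        grad_sq_sum += g * g
        # Effective step size alpha / sqrt(sum g^2) shrinks as
        # gradients accumulate, slowing us down near the minimum.
        x -= alpha * g / (math.sqrt(grad_sq_sum) + eps)
    return x
```

Because the denominator only ever grows, AdaGrad's step size decreases monotonically; this is exactly the "controlled alpha" described above, but it also means learning can stall on long runs, which motivates AdaDelta and RMSprop.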

Link for playlists:

Comments

AdaDelta is not what you are describing; that is RMSprop. AdaDelta doesn't even have a learning rate.
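For reference, here is a minimal sketch of the AdaDelta update that the comment refers to, again on the toy function f(x) = x^2 (function name and hyperparameter values are chosen only for illustration). The step size is the ratio of two running RMS averages, so no global learning rate alpha appears anywhere.

```python
import math

def adadelta_minimize(rho=0.9, eps=1e-6, steps=2000):
    """Minimize f(x) = x^2 with AdaDelta, starting from x = 1.

    Note: unlike AdaGrad or RMSprop, there is no learning-rate argument.
    """
    x = 1.0
    avg_sq_grad = 0.0  # running average E[g^2]
    avg_sq_step = 0.0  # running average E[dx^2]
    for _ in range(steps):
        g = 2.0 * x  # gradient of x^2
        avg_sq_grad = rho * avg_sq_grad + (1 - rho) * g * g
        # Step = -(RMS of previous steps / RMS of gradients) * gradient;
        # the units of the numerator make the update scale-free.
        step = -math.sqrt(avg_sq_step + eps) / math.sqrt(avg_sq_grad + eps) * g
        avg_sq_step = rho * avg_sq_step + (1 - rho) * step * step
        x += step
    return x
```

Dropping the `avg_sq_step` numerator and multiplying by a fixed alpha instead recovers RMSprop, which is the update described in the video.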

yatinarora