Top Optimizers for Neural Networks

In this video, I cover 16 of the most popular optimizers used for training neural networks, from basic Gradient Descent (GD) to more recent ones such as Adam, AdamW, and Lookahead.
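As a point of reference for the simplest method in the lineup, here is a minimal sketch of the vanilla gradient descent update on a toy 1-D problem. The function, learning rate, and step count are illustrative choices, not taken from the video.

```python
# Vanilla gradient descent on f(w) = (w - 3)^2, whose minimum is at w = 3.
def grad(w):
    return 2.0 * (w - 3.0)  # derivative of (w - 3)^2

w = 0.0     # initial parameter value (illustrative)
lr = 0.1    # learning rate (illustrative)
for _ in range(100):
    w -= lr * grad(w)  # the GD update: step against the gradient

print(round(w, 4))  # → 3.0
```

Every optimizer in the video refines this same loop, typically by adapting the step size per parameter or by accumulating momentum across steps.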

#deeplearning #artificialintelligence
#neuralnetworks #computerscience


Comments

This is fantastic! Thank you for taking the time to make this.

benemdav

AdamW and the Yogi optimizer could be considered universal first-order optimizers. I have experimented on many multi-modal tasks, and these two have consistently performed well.
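For readers unfamiliar with AdamW, the key difference from Adam is its decoupled weight decay: the decay term is applied directly to the weights rather than being folded into the gradient. A hedged, pure-Python sketch of the update on a 1-D quadratic follows; the hyperparameters and toy objective are illustrative.

```python
import math

def adamw_step(w, g, m, v, t, lr=0.1, b1=0.9, b2=0.999, eps=1e-8, wd=0.01):
    """One AdamW step for a scalar parameter w with gradient g at step t."""
    m = b1 * m + (1 - b1) * g        # first-moment (mean) estimate
    v = b2 * v + (1 - b2) * g * g    # second-moment (uncentered variance) estimate
    m_hat = m / (1 - b1 ** t)        # bias corrections
    v_hat = v / (1 - b2 ** t)
    # Decoupled weight decay: wd * w is added to the step, not to the gradient.
    w = w - lr * (m_hat / (math.sqrt(v_hat) + eps) + wd * w)
    return w, m, v

w, m, v = 5.0, 0.0, 0.0
for t in range(1, 201):
    g = 2.0 * w                      # gradient of f(w) = w^2
    w, m, v = adamw_step(w, g, m, v, t)

print(w)  # driven close to the minimum at w = 0
```

Yogi modifies only the second-moment update, using a sign-based rule instead of the exponential average, which keeps the effective learning rate from growing too quickly.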

karthikn

Thank you for this video! Excellent review.

star-rooo

Thanks for the explanation. I'd like to see a benchmark of these optimizers as an extension of this video.

SolathPrime

Great video ❤
Can you share a link to the slides, please?

shredder-