Lecture 6 | Convergence, Loss Surfaces, and Optimization

Carnegie Mellon University
Course: 11-785, Intro to Deep Learning
Offering: Fall 2019

Contents:
• Convergence in neural networks
• Rates of convergence
• Loss surfaces
• Learning rates, and optimization methods
• RMSProp, Adagrad, Momentum
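The optimizers named in the outline can be sketched as update rules on a toy problem. Below is a minimal, illustrative sketch (not the lecture's code) of Momentum, RMSProp, and Adagrad minimizing a 1-D quadratic loss; the function names and hyperparameter values are assumptions chosen for the example.

```python
import math

def grad(w):
    # Gradient of the toy loss L(w) = (w - 3)^2, minimized at w* = 3.
    return 2.0 * (w - 3.0)

def sgd_momentum(w, steps=100, lr=0.1, beta=0.9):
    # Momentum: v is an exponentially decaying sum of past gradients,
    # which smooths the update direction and speeds up convergence.
    v = 0.0
    for _ in range(steps):
        v = beta * v + grad(w)
        w = w - lr * v
    return w

def rmsprop(w, steps=100, lr=0.1, beta=0.9, eps=1e-8):
    # RMSProp: divide the step by a running RMS of recent gradient
    # magnitudes, giving an adaptive per-parameter learning rate.
    s = 0.0
    for _ in range(steps):
        g = grad(w)
        s = beta * s + (1.0 - beta) * g * g
        w = w - lr * g / (math.sqrt(s) + eps)
    return w

def adagrad(w, steps=100, lr=1.0, eps=1e-8):
    # Adagrad: like RMSProp, but s accumulates ALL past squared
    # gradients, so the effective learning rate only shrinks over time.
    s = 0.0
    for _ in range(steps):
        g = grad(w)
        s = s + g * g
        w = w - lr * g / (math.sqrt(s) + eps)
    return w

# Starting from w = 0, all three should land near the minimum w* = 3.
print(sgd_momentum(0.0), rmsprop(0.0), adagrad(0.0))
```

The key contrast the lecture's topic list points at: Momentum changes the *direction* of the step using gradient history, while RMSProp and Adagrad rescale its *magnitude*; Adagrad's accumulated denominator decays the step size forever, which RMSProp fixes with an exponential moving average.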
Comments
Author

If you are seeking deep learning lectures from someone who truly understands the content and connects the dots, here you go: Professor Raj is the best I've ever seen in deep learning. He explains everything in simple language, with very deep intuition.

mingx
Author

Looks like this professor has become my hero for the deep learning course. I understand his classes better than anyone else's.

vipa
Author

You click the first video feeling encouraged, and one of the first things you hear from the Professor is about the decline in students attending the class, with the sinking admission that it would be a miracle if even one student remained by the end of the course. Lol.
Still excellent content.

minatonamikazi
Author

Kids run away 😂😂😂😂
That's what happens after neural networks:
backpropagation.

Enem_Verse
Author

This video didn't give theoretical guarantees for stochastic gradient descent.

salmanfarhat