Synthetic Gradients – Decoupling the Layers of a Neural Net: Anuj Gupta

Once in a while, a (crazy!) idea comes along that can change the very fundamentals of a field. In this talk, we will look at one such idea that could change how neural networks are trained.

As of now, the backpropagation algorithm is at the heart of training any neural net. However, the algorithm suffers from certain drawbacks that force the layers of the network to be trained strictly sequentially. In this talk we will see a very powerful technique for breaking free from this severe limitation.
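To make the idea concrete, here is a minimal sketch of synthetic gradients in NumPy for a toy two-layer regression network. All names, sizes, and learning rates are illustrative assumptions, not from the talk: a small linear module `M` predicts the gradient of the loss with respect to the first layer's activation, so that layer can update immediately instead of waiting for the backward pass; `M` itself is trained toward the true gradient once it arrives.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy network: 4 -> 8 -> 1 regression (illustrative sizes).
W1 = rng.normal(0, 0.1, (4, 8))   # layer-1 weights
W2 = rng.normal(0, 0.1, (8, 1))   # layer-2 weights
M = np.zeros((8, 8))              # linear synthetic-gradient module for layer 1

x = rng.normal(size=(16, 4))
y = rng.normal(size=(16, 1))

def loss():
    return float(np.mean((np.tanh(x @ W1) @ W2 - y) ** 2))

loss_init = loss()
for step in range(200):
    # Layer 1 forward; update immediately with a *predicted* gradient,
    # without waiting for layer 2's backward pass (the decoupling).
    h = np.tanh(x @ W1)                    # layer-1 activation
    g_hat = h @ M                          # synthetic gradient dL/dh
    W1 -= 0.01 * (x.T @ (g_hat * (1 - h ** 2)))

    # Layer 2 forward/backward yields the *true* gradient dL/dh.
    err = h @ W2 - y
    g_true = err @ W2.T
    W2 -= 0.01 * (h.T @ err)

    # Train the synthetic-gradient module toward the true gradient.
    M -= 0.01 * (h.T @ (g_hat - g_true))

loss_final = loss()
```

In a real decoupled setup the two layers would run on separate devices and the true gradient would arrive asynchronously; the sketch keeps everything in one loop only to show the data flow.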

Comments

Synthetic gradients are also used in our brain, for the reward mechanism.

Ymet_news

Interesting topic, definitely... but I really want to know how long it took to train the network.

Also, one thing to keep in mind: we need to train not only the main network but also the small ones.

machineeducation