Synthetic Gradients Tutorial - How to Speed Up Deep Learning Training

Synthetic Gradients were introduced in 2016 by Max Jaderberg and other researchers at DeepMind, in the paper "Decoupled Neural Interfaces using Synthetic Gradients". Rather than replacing backpropagation outright, they decouple the layers of a network: each layer updates from a locally predicted (synthetic) gradient instead of waiting for the true gradient to propagate back from the loss. Removing this update locking can make deep neural networks faster to train, and it can also help Recurrent Neural Networks learn longer-term patterns in the data, since the synthetic gradient extends the effective horizon of truncated backpropagation through time.
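
To make the two-step idea concrete, here is a minimal PyTorch sketch of one decoupled update, not DeepMind's actual implementation. The class name SyntheticGradient, the layer sizes, and the stand-in true_grad tensor are illustrative assumptions; in a real network the true gradient would arrive from the downstream layers.

import torch
import torch.nn as nn

class SyntheticGradient(nn.Module):
    # Small model that learns to predict dLoss/dActivation from the activation itself.
    def __init__(self, dim):
        super().__init__()
        self.predictor = nn.Linear(dim, dim)

    def forward(self, activation):
        return self.predictor(activation)

layer = nn.Linear(128, 128)        # the layer being trained
dni = SyntheticGradient(128)       # its decoupled gradient predictor
opt_layer = torch.optim.SGD(layer.parameters(), lr=0.01)
opt_dni = torch.optim.SGD(dni.parameters(), lr=0.01)

x = torch.randn(32, 128)           # a dummy mini-batch

# Step 1: update the layer immediately from the predicted gradient,
# without waiting for the rest of the network to finish its backward pass.
h = layer(x)
with torch.no_grad():
    synthetic_grad = dni(h)
opt_layer.zero_grad()
h.backward(synthetic_grad)         # inject the predicted gradient at h
opt_layer.step()

# Step 2: once the true gradient dLoss/dh arrives from downstream,
# train the predictor to match it (a random stand-in here).
true_grad = torch.randn(32, 128)   # hypothetical true gradient, for illustration
opt_dni.zero_grad()
dni_loss = ((dni(h.detach()) - true_grad) ** 2).mean()
dni_loss.backward()
opt_dni.step()

The key design point is that step 1 needs only the layer's own activation, so layers can train asynchronously; step 2 is where the predictor slowly learns to approximate the real error signal.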

Papers:

Blog posts:

Implementations:

Slides:
