Synthetic Gradients Tutorial - How to Speed Up Deep Learning Training
Synthetic Gradients were introduced in 2016 by Max Jaderberg and other researchers at DeepMind, in the paper "Decoupled Neural Interfaces using Synthetic Gradients". Instead of waiting for the full backward pass of backpropagation, each layer is updated using a gradient predicted by a small auxiliary model, which decouples the layers during training. This can make all sorts of deep neural networks faster to train, sometimes even improving performance, and it can allow Recurrent Neural Networks to learn long-term patterns in the data.
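To make the idea concrete, here is a minimal NumPy sketch of the technique on a toy regression task. The names (`M` for the synthetic gradient module, the linear predictor, the toy data) are illustrative assumptions, not the paper's setup: the hidden layer updates immediately from a *predicted* gradient, while the true gradient, computed later by the output layer, is used only to train the predictor.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task (an assumption for the demo): y = sum(x).
X = rng.normal(size=(256, 8))
y = X.sum(axis=1, keepdims=True)

W1 = rng.normal(scale=0.1, size=(8, 16))   # hidden layer
W2 = rng.normal(scale=0.1, size=(16, 1))   # output layer
# Synthetic gradient module: predicts dL/dh from h.
# A linear predictor is a simplification; the paper also uses deeper ones.
M = np.zeros((16, 16))

lr, n = 0.01, len(X)
mse0 = float(np.mean((np.maximum(X @ W1, 0.0) @ W2 - y) ** 2))

for step in range(500):
    h = np.maximum(X @ W1, 0.0)        # forward through layer 1 (ReLU)

    # Decoupled update: layer 1 uses the *predicted* gradient right away,
    # without waiting for the backward pass.
    g_hat = h @ M                      # synthetic dL/dh
    W1 -= lr * X.T @ (g_hat * (h > 0)) / n

    # Output layer computes the loss and the *true* gradient "later".
    pred = h @ W2
    err = pred - y                     # dL/dpred for 0.5 * MSE
    g_true = err @ W2.T                # true dL/dh
    W2 -= lr * h.T @ err / n

    # Train the SG module to regress onto the true gradient.
    M -= lr * h.T @ (g_hat - g_true) / n

mse = float(np.mean((np.maximum(X @ W1, 0.0) @ W2 - y) ** 2))
```

In a real decoupled system the two halves would run asynchronously, possibly on different devices; the loop above just interleaves them to show the data flow.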
Papers:
Blog posts:
Implementations:
Slides: