CMU Neural Nets for NLP 2021 (4): Efficiency Tricks for Neural Nets

This lecture (by Graham Neubig) for CMU CS 11-747, Neural Networks for NLP (Spring 2021) covers:

* Tips for Training on GPUs
* Parallel Training
* Softmax Approximations: Negative Sampling, Hierarchical Softmax
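To illustrate the softmax-approximation topic, here is a minimal sketch of negative sampling in NumPy. All names, sizes, and the uniform negative-sampling distribution are assumptions for illustration (the lecture discusses the general technique; real implementations such as word2vec use a unigram-based sampler): instead of normalizing over the full vocabulary, the loss treats the true word as a positive example and K randomly drawn words as negatives, using binary logistic losses.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tiny setup: vocabulary size V, embedding dim D,
# K negative samples per positive example (all values illustrative).
V, D, K = 1000, 16, 5

W_in = rng.normal(scale=0.1, size=(V, D))   # input (context) embeddings
W_out = rng.normal(scale=0.1, size=(V, D))  # output embeddings

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def negative_sampling_loss(center, target):
    """Binary logistic loss over 1 positive and K sampled negatives,
    avoiding the O(V) cost of a full softmax normalization."""
    h = W_in[center]                        # (D,)
    negatives = rng.integers(0, V, size=K)  # uniform sampling, for simplicity
    pos_score = W_out[target] @ h           # scalar
    neg_scores = W_out[negatives] @ h       # (K,)
    # Maximize sigmoid(pos_score) and sigmoid(-neg_scores).
    return -np.log(sigmoid(pos_score)) - np.sum(np.log(sigmoid(-neg_scores)))

print(negative_sampling_loss(center=3, target=7))
```

The per-example cost is O(K·D) rather than O(V·D), which is the point of the approximation; hierarchical softmax achieves a similar speedup by replacing the flat softmax with a binary tree of logistic decisions.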
