Intro to Deep Learning -- L14 Intro to Recurrent Neural Networks [Stat453, SS20]


Covers some of the basics of recurrent neural networks. In particular, this lecture covers

RNNs and Sequence Modeling Tasks: 00:00
Backpropagation Through Time: 20:23
Long Short-Term Memory (LSTM): 31:42
Many-to-one Word RNNs: 45:16
Generating Text with Character RNNs: 50:45
Attention Mechanisms and Transformers: 1:00:09
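The recurrence that the lecture unrolls for backpropagation through time can be sketched in a few lines of plain Python (a minimal illustration with made-up weights, not the course's PyTorch code):

```python
import math

def rnn_step(x, h, W_xh, W_hh, b):
    """One Elman-RNN step: h_t = tanh(W_xh @ x_t + W_hh @ h_{t-1} + b)."""
    return [
        math.tanh(
            sum(W_xh[i][j] * x[j] for j in range(len(x)))
            + sum(W_hh[i][j] * h[j] for j in range(len(h)))
            + b[i]
        )
        for i in range(len(h))
    ]

def rnn_forward(xs, h0, W_xh, W_hh, b):
    """Unroll the recurrence over a whole sequence; BPTT is ordinary
    backpropagation applied to this unrolled computation graph."""
    h, hs = h0, []
    for x in xs:
        h = rnn_step(x, h, W_xh, W_hh, b)
        hs.append(h)
    return hs

# Toy 1-dimensional example (weights chosen arbitrarily):
hs = rnn_forward(xs=[[1.0], [0.0]], h0=[0.0],
                 W_xh=[[1.0]], W_hh=[[0.5]], b=[0.0])
# hs[0][0] = tanh(1.0), hs[1][0] = tanh(0.5 * tanh(1.0))
```

The LSTM covered at 31:42 replaces this single tanh update with gated updates to a separate cell state, which is what mitigates the vanishing-gradient problem of repeatedly multiplying by W_hh.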
Comments

It seems you have the same Jupyter notebook linked for both of the following?

RNN with GRU cells (IMDB)
[PyTorch: GitHub | Nbviewer]
Multilayer bi-directional RNN (IMDB)
[PyTorch: GitHub | Nbviewer]
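The two notebooks above should differ mainly in the direction of processing; as a hypothetical plain-Python sketch of what "bi-directional" adds (not the notebooks' code, which would typically just pass `bidirectional=True` to the PyTorch RNN module):

```python
import math

def step(x, h):
    # Toy 1-dimensional recurrent cell with fixed, arbitrary weights.
    return [math.tanh(x[0] + 0.5 * h[0])]

def bidirectional_forward(xs, h0, step):
    """Run one pass left-to-right and one right-to-left, then concatenate
    the two hidden states at each time step -- the core idea behind a
    bidirectional RNN layer."""
    fwd, h = [], h0
    for x in xs:
        h = step(x, h)
        fwd.append(h)
    bwd, h = [], h0
    for x in reversed(xs):
        h = step(x, h)
        bwd.append(h)
    bwd.reverse()  # align backward states with the original time order
    return [f + b for f, b in zip(fwd, bwd)]

out = bidirectional_forward([[1.0], [0.0]], [0.0], step)
# each time step now yields 2 features: forward state + backward state
```

This is why a bidirectional layer doubles the output feature dimension, and why it only suits tasks (like IMDB sentiment classification) where the whole sequence is available up front.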

gireeshbogu

RNNs and Sequence Modeling Tasks: 00:00
Backpropagation Through Time: 20:23
Long Short-Term Memory (LSTM): 31:42
Many-to-one Word RNNs: 45:16
Generating Text with Character RNNs: 50:45
Attention Mechanisms and Transformers: 1:00:09

gireeshbogu

Since only a few lectures are left in this playlist, could you please start a series where you teach how to implement all of this in Keras? For example, explaining how to write the RNN layers in Keras and the other steps needed, so we can get hands-on and actually build something. It would be great to see these things actually work. Also, could these RNN models be used to predict scientific paper citations?

its_me

*making RNN videos shorter hit me harder than anything recently*

vinayreddy