Stanford CS224N: NLP with Deep Learning | Winter 2019 | Lecture 7 – Vanishing Gradients, Fancy RNNs


Professor Christopher Manning & PhD Candidate Abigail See, Stanford University

Professor Christopher Manning
Thomas M. Siebel Professor in Machine Learning, Professor of Linguistics and of Computer Science
Director, Stanford Artificial Intelligence Laboratory (SAIL)


0:00 Introduction
0:34 Announcements
3:06 Overview
3:40 Today's lecture
4:36 Vanishing gradient intuition
6:22 Vanishing gradient proof sketch
14:42 Why is vanishing gradient a problem?
16:11 Effect of vanishing gradient on RNN-LM
20:28 Why is exploding gradient a problem?
22:13 Gradient clipping: solution for exploding gradient
28:18 How to fix vanishing gradient problem?
29:41 Long Short-Term Memory (LSTM)
40:26 How does LSTM solve vanishing gradients?
44:37 LSTMs: real-world success
46:26 Gated Recurrent Units (GRU)
52:36 Is vanishing/exploding gradient just a RNN problem?
57:55 Recap
59:00 Bidirectional RNNs: motivation
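One of the fixes covered above (22:13) is gradient clipping: if the global norm of the gradient exceeds a threshold, rescale it before the update so its norm equals the threshold. A minimal sketch in plain Python (the function name and threshold value are illustrative, not from the lecture):

```python
import math

def clip_gradients(grads, max_norm):
    """Global-norm gradient clipping: rescale a flat list of gradient
    values so their L2 norm does not exceed max_norm. Direction is
    preserved; only the magnitude is reduced."""
    total_norm = math.sqrt(sum(g * g for g in grads))
    if total_norm > max_norm:
        scale = max_norm / total_norm
        grads = [g * scale for g in grads]
    return grads

# A gradient of norm 5 ([3, 4]) clipped to norm 2:
print(clip_gradients([3.0, 4.0], max_norm=2.0))  # → [1.2, 1.6]
```

The same idea appears in deep learning frameworks (e.g. as a clip-by-global-norm utility); clipping keeps the update step bounded when the loss surface has steep cliffs, without changing the update direction.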
Comments

It seems like the order of the videos has been shuffled.

yashiroisana