Stanford CS224N: NLP with Deep Learning | Winter 2019 | Lecture 6 – Language Models and RNNs
Professor Christopher Manning & PhD Candidate Abigail See, Stanford University
Professor Christopher Manning
Thomas M. Siebel Professor in Machine Learning, Professor of Linguistics and of Computer Science
Director, Stanford Artificial Intelligence Laboratory (SAIL)
0:00 Introduction
0:33 Overview
2:50 You use Language Models every day!
5:36 n-gram Language Models: Example
10:12 Sparsity Problems with n-gram Language Models
10:58 Storage Problems with n-gram Language Models
11:34 n-gram Language Models in practice
12:53 Generating text with an n-gram Language Model
15:08 How to build a neural Language Model?
16:03 A fixed-window neural Language Model
20:57 Recurrent Neural Networks (RNN)
22:39 A RNN Language Model
32:51 Training an RNN Language Model
36:35 Multivariable Chain Rule
37:10 Backpropagation for RNNs: Proof sketch
41:23 Generating text with an RNN Language Model
51:39 Evaluating Language Models
53:30 RNNs have greatly improved perplexity
54:09 Why should we care about Language Modeling?
58:30 Recap
59:21 RNNs can be used for tagging
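As a companion to the n-gram segments in the chapter list above (5:36, 12:53), here is a minimal, illustrative bigram language model in Python. It is a sketch of the counting-based approach those segments cover, not code from the lecture; the tiny corpus and function names are invented for the example.

```python
from collections import defaultdict, Counter
import random

# Toy corpus (hypothetical, echoing the lecture's "students opened their" example).
corpus = "the students opened their books the students opened their laptops".split()

# Count bigrams: P(w_t | w_{t-1}) is approximated by
# count(w_{t-1}, w_t) / count(w_{t-1}).
bigrams = defaultdict(Counter)
for prev, cur in zip(corpus, corpus[1:]):
    bigrams[prev][cur] += 1

def prob(prev, cur):
    """Maximum-likelihood bigram probability estimate."""
    total = sum(bigrams[prev].values())
    return bigrams[prev][cur] / total if total else 0.0

def generate(start, n_words, seed=0):
    """Sample a continuation by repeatedly drawing the next word
    from the conditional bigram distribution."""
    random.seed(seed)
    words = [start]
    for _ in range(n_words):
        nxt = bigrams[words[-1]]
        if not nxt:  # dead end: no observed continuation (a sparsity problem)
            break
        choices, weights = zip(*nxt.items())
        words.append(random.choices(choices, weights=weights)[0])
    return " ".join(words)

print(prob("students", "opened"))  # 1.0 in this tiny corpus
print(generate("the", 4))
```

The dead-end branch in `generate` is exactly the sparsity issue the 10:12 chapter names: any bigram unseen in training gets probability zero, which is what smoothing and, later, neural language models address.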
Stanford CS224N: NLP with Deep Learning | Winter 2021 | Lecture 1 - Intro & Word Vectors
Stanford CS224N NLP with Deep Learning | 2023 | Python Tutorial, Manasi Sharma
Stanford CS224N NLP with Deep Learning | 2023 | PyTorch Tutorial, Drew Kaul
Stanford CS224N NLP with Deep Learning | 2023 | Lecture 8 - Self-Attention and Transformers
Stanford CS224N NLP with Deep Learning | 2023 | Lecture 9 - Pretraining
Stanford CS224N NLP with Deep Learning | 2023 | Hugging Face Tutorial, Eric Frankel
Stanford CS224N: NLP with Deep Learning | Winter 2019 | Lecture 1 – Introduction and Word Vectors
Stanford CS224N: NLP with Deep Learning | Winter 2020 | BERT and Other Pre-trained Language Models
Stanford CS224N NLP with Deep Learning | Winter 2021 | Lecture 6 - Simple and LSTM RNNs
Stanford CS224N NLP with Deep Learning | Winter 2021 | Lecture 7 - Translation, Seq2Seq, Attention
Stanford CS224N NLP with Deep Learning | Winter 2021 | Lecture 14 - T5 and Large Language Models
Stanford CS224N NLP with Deep Learning | Winter 2021 | Lecture 2 - Neural Classifiers
Stanford CS224N - NLP w/ DL | Winter 2021 | Lecture 5 - Recurrent Neural Networks (RNNs)
Stanford CS224N NLP with Deep Learning | Spring 2022 | Guest Lecture: Scaling Language Models
Stanford CS224N NLP with Deep Learning | Winter 2021 | Lecture 12 - Natural Language Generation
Stanford CS224N NLP with Deep Learning | Winter 2021 | Lecture 9 - Self-Attention and Transformers
Stanford CS224N NLP with Deep Learning | Spring 2022 | Guest Lecture: Building Knowledge Representation
Stanford CS224N NLP with Deep Learning | Winter 2021 | Lecture 12 - Question Answering
Stanford CS224N NLP with Deep Learning | Winter 2021 | Lecture 13 - Coreference Resolution
Stanford CS224N: NLP with Deep Learning | Winter 2019 | Lecture 15 – Natural Language Generation
Stanford CS224N NLP with Deep Learning | Winter 2021 | Lecture 18 - Future of NLP + Deep Learning
Stanford CS224N | NLP with Deep Learning | Spring 2022 | Socially Intelligent NLP Systems
Stanford CS224N - NLP w/ DL | Winter 2021 | Lecture 4 - Syntactic Structure and Dependency Parsing