Stanford CS224N: NLP with Deep Learning | Winter 2021 | Lecture 1 - Intro & Word Vectors
This lecture covers:
1. The course (10 min)
2. Human language and word meaning (15 min)
3. Word2vec algorithm introduction (15 min)
4. Word2vec objective function gradients (25 min)
5. Optimization basics (5 min)
6. Looking at word vectors (10 min or less)
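Items 3 and 4 above cover the word2vec objective. As a minimal sketch of the idea (toy random vectors, hypothetical matrix names `V` and `U`, not the lecture's actual code), the skip-gram model scores a context word o given a center word c with a softmax over dot products:

```python
import numpy as np

# Toy skip-gram setup: predict a context word o given a center word c.
# V holds center-word vectors v_w, U holds context-word vectors u_w
# (both hypothetical here; word2vec learns them from a corpus).
np.random.seed(0)
vocab_size, dim = 5, 4
V = np.random.randn(vocab_size, dim)
U = np.random.randn(vocab_size, dim)

def softmax_prob(c, o):
    """P(o | c) = exp(u_o . v_c) / sum_w exp(u_w . v_c)."""
    scores = U @ V[c]                    # u_w . v_c for every word w
    exps = np.exp(scores - scores.max()) # subtract max for numerical stability
    return exps[o] / exps.sum()

# The probabilities over the whole vocabulary sum to 1.
probs = [softmax_prob(2, o) for o in range(vocab_size)]
print(round(sum(probs), 6))
```

Training then adjusts V and U to maximize this probability for word pairs that actually co-occur in a corpus.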
Key learning: The (really surprising!) result that word meaning can be represented rather well by a large vector of real numbers.
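To make that key result concrete: once words are vectors, semantic similarity becomes a geometric quantity such as cosine similarity. A minimal sketch with made-up 4-dimensional vectors (real word2vec embeddings are typically 100-300 dimensions, learned from text, not hand-set):

```python
import numpy as np

# Hypothetical toy word vectors, chosen by hand for illustration only.
vecs = {
    "king":  np.array([0.9, 0.8, 0.1, 0.2]),
    "queen": np.array([0.85, 0.75, 0.15, 0.8]),
    "apple": np.array([0.1, 0.2, 0.9, 0.3]),
}

def cosine(a, b):
    # Cosine similarity: dot product of the vectors divided by their norms.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Semantically related words should score higher than unrelated ones.
print(cosine(vecs["king"], vecs["queen"]) > cosine(vecs["king"], vecs["apple"]))
# → True
```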
This course will teach:
1. The foundations of effective modern deep learning methods applied to NLP. Basics first, then key methods used in NLP: recurrent networks, attention, transformers, etc.
2. A big picture understanding of human languages and the difficulties in understanding and producing them
3. An understanding of, and the ability to build, systems (in PyTorch) for some of the major problems in NLP: word meaning, dependency parsing, machine translation, question answering.
Professor Christopher Manning
Thomas M. Siebel Professor in Machine Learning, Professor of Linguistics and of Computer Science
Director, Stanford Artificial Intelligence Laboratory (SAIL)
0:00 Introduction
1:43 Goals
3:10 Human Language
10:07 Google Translate
10:43 GPT
14:13 Meaning
16:19 Wordnet
19:11 Word Relationships
20:27 Distributional Semantics
23:33 Word Embeddings
27:31 Word2vec
37:55 How to minimize loss
39:55 Interactive whiteboard
41:10 Gradient
48:50 Chain Rule
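The gradient and chain-rule segments (41:10 onward) work toward the standard gradient of the skip-gram log-likelihood with respect to the center-word vector. A sketch of that derivation, using the same softmax definition of p(o | c):

```latex
\frac{\partial}{\partial v_c} \log p(o \mid c)
  = \frac{\partial}{\partial v_c}
    \left( u_o^\top v_c - \log \sum_{w} \exp(u_w^\top v_c) \right)
  = u_o - \sum_{w} p(w \mid c)\, u_w
```

That is, the gradient is the observed context vector minus the model's expected context vector, which drives the update toward making observed contexts more probable.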