Integer embeddings in PyTorch
In this video, we implement a paper called "Learning Mathematical Properties of Integers". Most notably, we train custom integer embeddings with an LSTM network on data from the On-Line Encyclopedia of Integer Sequences (OEIS). We also extract integer embeddings from already pretrained models, BERT and GloVe. We then compare how well these embeddings encode mathematical properties of integers (such as divisibility by 2 and primality).
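The core idea above, an nn.Embedding over integer tokens feeding an LSTM trained to predict the next integer in a sequence, can be sketched roughly like this (class and parameter names are illustrative, not the video's actual implementation):

```python
import torch
import torch.nn as nn


class IntegerLSTM(nn.Module):
    """Sketch: learn integer embeddings via next-integer prediction on OEIS-style sequences."""

    def __init__(self, vocab_size=10000, emb_dim=64, hidden_dim=128):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)  # one learnable row per integer
        self.lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, vocab_size)  # logits over the next integer

    def forward(self, x):
        emb = self.embedding(x)  # (batch, seq_len, emb_dim)
        out, _ = self.lstm(emb)  # (batch, seq_len, hidden_dim)
        return self.head(out)    # (batch, seq_len, vocab_size)


model = IntegerLSTM()
seq = torch.tensor([[0, 1, 1, 2, 3, 5, 8]])  # e.g. a Fibonacci prefix as token ids
logits = model(seq)
print(logits.shape)  # torch.Size([1, 7, 10000])
```

After training with cross-entropy on the next-token logits, `model.embedding.weight` is the learned integer-embedding matrix.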
00:00 Intro
00:41 Ideas and high level explanation
02:56 Data - On-Line Encyclopedia of Integer Sequences
03:58 Data - raw download exploration
05:43 CustomDataset - implementation
09:15 CustomDataset - testing it out
11:36 Network - implementation
15:54 Network - testing it out
16:58 Evaluation utilities
19:05 GloVe embeddings parsing
22:09 BERT embeddings parsing
24:32 LSTM training script
30:58 Experiments to be run
31:39 Results: LSTM guess next
33:53 Results: Metrics (TensorBoard)
36:33 Results: Embeddings projections (TensorBoard)
40:05 Outro
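The evaluation step, checking how well an embedding matrix encodes a property like parity, can be sketched with a linear probe. Everything here is a stand-in (random embeddings instead of the trained/GloVe/BERT ones), just to show the shape of the experiment:

```python
import torch
import torch.nn as nn

# Stand-in for a real (num_integers, dim) embedding matrix from the LSTM, GloVe, or BERT.
num_integers, dim = 1000, 64
emb = torch.randn(num_integers, dim)
labels = torch.tensor([n % 2 for n in range(num_integers)], dtype=torch.float32)  # parity

probe = nn.Linear(dim, 1)  # a linear probe: can the property be read off linearly?
opt = torch.optim.Adam(probe.parameters(), lr=1e-2)
loss_fn = nn.BCEWithLogitsLoss()

for _ in range(100):
    opt.zero_grad()
    loss = loss_fn(probe(emb).squeeze(1), labels)
    loss.backward()
    opt.step()

acc = ((probe(emb).squeeze(1) > 0).float() == labels).float().mean()
print(f"probe accuracy: {acc:.2f}")
```

With random embeddings the accuracy stays near chance; embeddings that actually encode divisibility by 2 would score much higher, which is the comparison the video runs across the LSTM, GloVe, and BERT embeddings.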