torch.nn.Embedding explained (+ Character-level language model)

In this video, I will talk about the Embedding module of PyTorch. It has a lot of applications in the natural language processing field and also when working with categorical variables. I will explain some of its functionalities, such as the padding index and maximum norm. In the second part of this video, I will use the Embedding module to represent the characters of the English alphabet and build a text-generating model. Once we train the model, we will look into how the character embeddings evolved over the epochs.
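
The video's own snippets are not reproduced here, but a minimal sketch of the two behaviours mentioned above, padding_idx and max_norm, might look like this (the vocabulary size, dimensions and indices are arbitrary assumptions):

```python
import torch
import torch.nn as nn

# Vocabulary of 10 tokens, each mapped to a 3-dimensional vector.
# padding_idx=0 keeps the padding token's embedding fixed at zeros
# (and it receives no gradient updates during training).
emb = nn.Embedding(num_embeddings=10, embedding_dim=3, padding_idx=0)

ids = torch.tensor([[1, 2, 0, 0],   # a padded batch of index sequences
                    [4, 5, 6, 0]])
vectors = emb(ids)                  # shape: (2, 4, 3)
print(vectors[0, 2])                # all zeros, because index 0 is the padding index

# max_norm renormalizes any embedding whose L2 norm exceeds the limit
# whenever that embedding is looked up.
emb_clipped = nn.Embedding(10, 3, max_norm=1.0)
out = emb_clipped(torch.tensor([1, 2, 3]))
print(out.norm(dim=1))              # each row's norm is <= 1.0
```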

00:00 Intro
01:23 BERT example
01:56 Behavior explained (IPython)
04:25 Intro to the character-level model
05:29 Dataset implementation
08:53 Network implementation
12:12 Text-generating function
14:00 Training script implementation
17:55 Launching and analyzing results
18:31 Visualization of results
20:31 Outro
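
For orientation, here is a rough, hypothetical skeleton of the network part covered by the chapters above (an Embedding layer feeding an LSTM and a linear head); it is not the code written in the video, and all names and sizes are assumptions:

```python
import torch
import torch.nn as nn

class CharModel(nn.Module):
    """Hypothetical character-level network: embed characters, run an LSTM,
    project the hidden states back to character logits."""

    def __init__(self, vocab_size=28, embedding_dim=2, hidden_dim=32, padding_idx=0):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embedding_dim, padding_idx=padding_idx)
        self.lstm = nn.LSTM(embedding_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, vocab_size)

    def forward(self, x):
        # x: (batch, seq_len) of character indices
        emb = self.embedding(x)        # (batch, seq_len, embedding_dim)
        out, (h, c) = self.lstm(emb)   # out: (batch, seq_len, hidden_dim)
        logits = self.fc(out)          # (batch, seq_len, vocab_size)
        return logits

model = CharModel()
dummy = torch.randint(1, 28, (4, 10))   # batch of 4 sequences, 10 characters each
print(model(dummy).shape)               # torch.Size([4, 10, 28])
```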

Credits: logo animation
Comments

I love this, man, and your keyboard sounds are the cherry on top.

the_osbm

It's so satisfying to watch you type anything.

yuehpo-peng

Cool to see the embedding evolution at the end!

DikkeHamster

At first, I did not realize the typing was sped up. I was very impressed by your typing skills.

TheRovardotter

Okay, but what keyboard setup are you using? Or is it a sound effect?

LyurGG

Very informative, definitely looking forward to seeing your next videos!

IqweoR

At 11:30, shouldn't you take the last layer of h instead of averaging over all of h? From my understanding, the last layer of h represents the last prediction of the sequence.

ofeknourian
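
For context on the question above: with PyTorch's nn.LSTM, the final hidden state h has shape (num_layers, batch, hidden_size), so "take the last layer" versus "average over all of h" can be illustrated roughly as below. This is not the video's code; the sizes are made up.

```python
import torch
import torch.nn as nn

lstm = nn.LSTM(input_size=2, hidden_size=16, num_layers=3, batch_first=True)
x = torch.randn(4, 10, 2)              # (batch, seq_len, features)
out, (h, c) = lstm(x)                  # h: (num_layers, batch, hidden_size)

last_layer_state = h[-1]               # (batch, hidden_size): top layer's final state
averaged_state = h.mean(dim=0)         # (batch, hidden_size): mean over all layers

# For an unidirectional LSTM, out[:, -1, :] equals h[-1], i.e. the top layer's
# output at the last time step.
print(torch.allclose(out[:, -1, :], last_layer_state))  # True
```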

Seeing step by step how the embeddings work under the hood got me through my assignment, thank you!

culturosityofficial

Thank you very much, sir. I deeply appreciate your hands-on approach to teaching how to use the embedding layer with LSTMs.

azmyin

Please tell me you're speeding up the portions of the video where you type; if not, you're making the rest of us look like typing peasants.

vijayabhaskar-j

What's the keyboard? Sounds great when typing.

ろんサトシ

Very thocky! What switches are you using?

Danny-wevz

Bro, what keyboard are you using? The sound is amazing.

thepresistence

Thank you for this video, very clear. BTW, you can use higher dimensions for the embedding and use PCA to reduce to two dimensions when you need to plot.

fuat
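
The suggestion above, sketched with scikit-learn's PCA on a hypothetical trained embedding; the character set and dimensions here are assumptions, not the video's values.

```python
import torch
import torch.nn as nn
from sklearn.decomposition import PCA
import matplotlib.pyplot as plt

# Hypothetical trained embedding: 28 characters in 16 dimensions.
emb = nn.Embedding(28, 16)
weights = emb.weight.detach().numpy()                  # (28, 16)

coords = PCA(n_components=2).fit_transform(weights)    # (28, 2)

chars = list("abcdefghijklmnopqrstuvwxyz") + ["<pad>", " "]
plt.scatter(coords[:, 0], coords[:, 1])
for (x, y), ch in zip(coords, chars):
    plt.annotate(ch, (x, y))
plt.title("Character embeddings projected to 2D with PCA")
plt.show()
```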

Dude, your keyboard has a weird typing sound; what keyboard are you using? BTW, the video is nice.

locorocozhong

Amazing quality! I'd love to know how you approach learning ML. Do you "just" take courses at university and read math books? And what book/course would you recommend? I'd really appreciate an answer. Thanks!

redone

Just a thought: create a model for your videos.
Input: raw video.
Output: normal speed where you are speaking and fast-forward where you're typing.

kajalchaurasia

Hey, great video!! Liked and subscribed. I was really hoping you could make a video on how to work with attention layers in PyTorch. As you know, they are very cool, but there is a lack of resources on how to use them in a custom model.
It would be really great! Thanks!!

mrigankanath

I didn't watch the video carefully, but does it explain how the nn.Embedding code is written, or is it just the behavior of nn.Embedding that's explained in the first part of the video?

popamaji

Doing ML in vim is absolutely gigachad

navins