Positional embeddings in transformers EXPLAINED | Demystifying positional encodings.

What are positional embeddings / encodings?

► Outline:
00:00 What are positional embeddings?
03:39 Requirements for positional embeddings
04:23 Sines, cosines explained: The original solution from the “Attention is all you need” paper
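
For reference, a minimal NumPy sketch of the sine/cosine scheme covered at 04:23 (the formula is from the paper; the function name and shapes are illustrative choices, not from the video):

    import numpy as np

    def sinusoidal_positional_encoding(max_len, d_model):
        # One row per position; each row holds d_model sinusoid values
        # at geometrically spaced frequencies, per "Attention is all you need".
        positions = np.arange(max_len)[:, None]    # (max_len, 1)
        dims = np.arange(0, d_model, 2)[None, :]   # (1, d_model // 2), assumes even d_model
        angles = positions / np.power(10000.0, dims / d_model)
        pe = np.zeros((max_len, d_model))
        pe[:, 0::2] = np.sin(angles)  # even dimensions get sines
        pe[:, 1::2] = np.cos(angles)  # odd dimensions get cosines
        return pe

    print(sinusoidal_positional_encoding(50, 128).shape)  # (50, 128)

Note how every position gets a whole vector of values, each drawn from a sinusoid of a different frequency.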

▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
NEW (channel update):
🔥 Optionally, buy us a coffee to boost our Coffee Bean production! ☕
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀

Paper 📄

Music 🎵 :

---------------------------
🔗 Links:

#AICoffeeBreak #MsCoffeeBean #MachineLearning #AI #research​
Comments

I found no explanation for this anywhere, and when reading the paper I missed the detail that each token's positional encoding consists of multiple values (calculated by different sine functions). Your explanation and visual representation finally made me understand! Fourier transforms are genius and I'm amazed at how many different areas they show up in.

anonymousanon

Love how the "Attention is all you need" paper appears with epic background music.

yimingqu

This is the most intuitive explanation of positional encoding I have come across. Everyone out there explains how to do it, even with code, but not the why, which is more important.
Keep this up. You have earned my subscription.

sqripter

I had my morning coffee with this and will make it a habit!

deepk

+1 for a video on relative positional representations!

hannesstark

I couldn't find a satisfying explanation anywhere. This video finally made me understand things in a bit more detail, especially the use of sine and cosine functions across multiple dimensions.
Thank you! You're awesome.

parthvashisht

I've read numerous articles explaining positional embeddings so far; however, this is surely the greatest & clearest ever.

yyyang_

Great stuff :) Would love to see more of that, especially for images or geometry!

Phenix

You've got a solid understanding of the mathematics of signal processing.

rahulchowdhury

Great explanation of the intuition of positional encodings used in the Transformer!

magnuspierrau

Probably the clearest explanation of positional encoding :D

yusufani

Just watched this again for a refresher; the best video out there on the subject!

ylazerson

The best explanation of how exactly position embeddings work!

exoticcoder

+1 for more vids on positional encodings.

adi

Great explanation of positional embeddings. Just what I need.

elinetshaaf

This is probably the best explanation of this topic on YouTube! Great work!

mbrochh

This video is a clear explanation of why you shouldn't add your positional encoding but concatenate it.

haluk
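
To make the add-vs-concatenate point above concrete, a minimal NumPy sketch; the shapes and random token embeddings are illustrative assumptions, not from the video:

    import numpy as np

    seq_len, d_model = 10, 4                         # illustrative sizes only
    rng = np.random.default_rng(0)
    token_emb = rng.normal(size=(seq_len, d_model))  # stand-in token embeddings

    # Sinusoidal positional encodings, as in the sketch further up.
    positions = np.arange(seq_len)[:, None]
    dims = np.arange(0, d_model, 2)[None, :]
    angles = positions / np.power(10000.0, dims / d_model)
    pos_enc = np.zeros((seq_len, d_model))
    pos_enc[:, 0::2] = np.sin(angles)
    pos_enc[:, 1::2] = np.cos(angles)

    # Adding (what the original Transformer does): dimension stays d_model.
    added = token_emb + pos_enc                                   # (10, 4)

    # Concatenating (the alternative argued for above): dimension doubles.
    concatenated = np.concatenate([token_emb, pos_enc], axis=-1)  # (10, 8)

Adding keeps the model width fixed but mixes position into the same channels as content; concatenating keeps the two separate at the cost of a wider model.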

Amazing content. After seeing this, all the articles and research papers make sense.

gauravchattree

I was browsing through some channels after first stopping on Sean Cannell's, and I noticed your channel. You've got a great little channel building up here. I decided to drop by and show some support. Keep up the great content and I hope you keep posting :)

ConsistentAsh

Very intuitive. I knew about sine/cosine positional encoding, but this is where I actually got it. 👍👍

jayjiyani