Positional encodings in transformers (NLP817 11.5)

Comments

Thanks! Best explanation on this that I've seen so far, and I've seen a lot.

guestvil

best explanation of positional encoding that I've seen. TY

delbarton

This helped a lot. Fantastic intuitive explanation.

Josia-pm

Great lecture, Prof. May I ask what d=6 and d=7 are here? Is it the embedding dimension? If so, for d=6 we should have 3 pairs of sine-cosine waves, right?
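(The question above can be checked directly: in the standard sinusoidal positional encoding from "Attention Is All You Need", dimensions come in pairs (2i, 2i+1) that share one frequency, so an embedding dimension d=6 does give exactly 3 sine-cosine pairs. A minimal sketch, with `sinusoidal_pe` as a hypothetical helper name:)

```python
import math

def sinusoidal_pe(pos, d):
    # Sinusoidal positional encoding: pair (2i, 2i+1) shares the
    # frequency 1 / 10000^(2i/d), so d dimensions give d/2 sin-cos pairs.
    pe = [0.0] * d
    for i in range(d // 2):
        freq = 1.0 / (10000 ** (2 * i / d))
        pe[2 * i] = math.sin(pos * freq)      # even index: sine
        pe[2 * i + 1] = math.cos(pos * freq)  # odd index: cosine
    return pe

# d=6 -> 3 frequencies, i.e. 3 sine-cosine pairs.
print(len(sinusoidal_pe(5, 6)))  # 6 values
print(sinusoidal_pe(0, 6))       # at pos=0: sin=0, cos=1 in each pair
```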

arnabsinha

Thank you, really great explanation. I think I can understand it now.

chetterhummin