Understanding the GloVe method for generating word embeddings

#nlp #machinelearning #datascience #artificialintelligence
It's the W matrix that gives us the word embeddings in the last equation.
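
For reference, the weighted least-squares objective that GloVe minimizes (Pennington et al., 2014) is

J = \sum_{i,j=1}^{V} f(X_{ij}) \left( w_i^\top \tilde{w}_j + b_i + \tilde{b}_j - \log X_{ij} \right)^2,
\qquad
f(x) = \begin{cases} (x / x_{\max})^{\alpha} & \text{if } x < x_{\max} \\ 1 & \text{otherwise} \end{cases}

where X_{ij} is the co-occurrence count of words i and j. The rows of the learned matrix W (summed with the context matrix \tilde{W} in the paper) are the word embeddings.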
Comments

Super explanation, dude. I love how you explained this topic.

sasidharreddykatikam

How did we get W_i and W_k, the word vector representations of words i and k?

kindaeasy
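
In GloVe itself, w_i and w̃_k are not read off the co-occurrence matrix: they are parameters, initialized randomly and learned by gradient descent on the objective above. A minimal sketch of one update, assuming numpy; all names and hyperparameters here are illustrative, not from the video:

import numpy as np

V, d = 5000, 50                                  # vocabulary size, embedding dim
rng = np.random.default_rng(0)
W  = rng.normal(scale=0.1, size=(V, d))          # target vectors w_i
Wc = rng.normal(scale=0.1, size=(V, d))          # context vectors w~_k
b, bc = np.zeros(V), np.zeros(V)                 # bias terms

def sgd_step(i, k, x_ik, lr=0.05, x_max=100.0, alpha=0.75):
    # One stochastic step on the squared error for the pair (i, k).
    wi, wk = W[i].copy(), Wc[k].copy()
    f = min(1.0, (x_ik / x_max) ** alpha)        # weighting f(X_ik)
    g = f * (wi @ wk + b[i] + bc[k] - np.log(x_ik))
    W[i]  -= lr * g * wk
    Wc[k] -= lr * g * wi
    b[i]  -= lr * g
    bc[k] -= lr * g

After training, row i of W (or of W + Wc, as in the paper) is the embedding of word i.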

At 1:37 your matrix is wrong. Assuming the window size is 1, 'good' should have the value 2 for 'a', and 'is' should also have a value of 2: you have to count the words on both the left and the right of the word whose row you are filling in (see the sketch at the end of the thread). Come on, man!!

kindaeasy
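
On the point raised about 1:37: with window size 1, both the left and the right neighbour of each word contribute one count. A minimal sketch, assuming a plain Python token list; the example sentence is made up, not the one from the video:

from collections import defaultdict

def cooccurrence(tokens, window=1):
    # For each position, every word within `window` positions to the
    # left AND the right contributes one count to that pair.
    counts = defaultdict(float)
    for i, word in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                counts[(word, tokens[j])] += 1.0
    return counts

print(cooccurrence("a good movie is a good movie".split()))
# ('a', 'good') comes out as 2.0 because 'a' is followed by 'good' twice.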