Word2Vec: Natural Language Processing

How do we turn words into vectors?

Comments

Finally, someone made a clear and concise introduction to Word2Vec! I admire you!

matthewson

Great video. One thing to add: instead of always discarding the context vectors at the end, another strategy (mentioned in other videos/articles) is to concatenate or add the two vectors for a word.

revathik
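
A rough sketch of the strategy mentioned in the comment above (adding or concatenating the main and context vectors), assuming a trained skip-gram model whose learned parameters are a main (input) embedding matrix W and a context (output) embedding matrix C; the matrices below are random stand-ins, not real trained weights:

```python
import numpy as np

# Hypothetical trained embedding matrices (vocab_size x dim).
# W holds the main (center-word) vectors, C the context (output) vectors.
vocab_size, dim = 10_000, 100
W = np.random.randn(vocab_size, dim)  # stand-in for learned main vectors
C = np.random.randn(vocab_size, dim)  # stand-in for learned context vectors

# Option 1: keep only the main vectors (what the video does).
emb_main = W

# Option 2: add (or average) the two vectors for each word.
emb_sum = W + C

# Option 3: concatenate them, giving 2*dim-dimensional embeddings.
emb_concat = np.concatenate([W, C], axis=1)

print(emb_main.shape, emb_sum.shape, emb_concat.shape)
# (10000, 100) (10000, 100) (10000, 200)
```

Which option works best is usually an empirical question, judged on a downstream task or a word-similarity benchmark.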

You are a genius! You are able to explain abstract things well with only a whiteboard (no animation!)

YingleiZhang

Great explanation, it’s pure gold! Can you make a video on hierarchical softmax? It has confused me for a long time.

yangwang

This is amazing!! I remember trying to understand this back when it was first published, and I failed so hard... thank you

a

Exceptionally simplified explanation. Thanks 😊

drsandeepvm

I appreciate your explanations.
I was stuck on Word2Vec, but your explanation was more than enough.
Thank you so much!!!

sefinehtesfa

Great video. You mentioned that we discard the context vectors and take the main vectors as our final word embeddings. I just want to add that based on the literature, you can also add the main and context vectors, or concatenate them.

jett_royce

Probably the best video explanation I've watched. Thank you.

DC-tqkh

The best and most concise explanation of Word2Vec that I've seen so far. I probably need to go back and review gradient descent again, because updating the weights is still confusing.

DennisRice-lhnd
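
For anyone else fuzzy on how the weights get updated, here is a minimal, illustrative sketch of one stochastic-gradient step for a single (center word, context word) pair. It assumes the negative-sampling variant of skip-gram (the video may instead use the full softmax), and the vector names v, u_pos and u_negs are hypothetical:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
dim, lr = 50, 0.025                        # embedding size and learning rate (illustrative values)

v = rng.normal(size=dim) * 0.1             # main vector of the center word
u_pos = rng.normal(size=dim) * 0.1         # context vector of the true context word
u_negs = rng.normal(size=(5, dim)) * 0.1   # context vectors of 5 sampled "negative" words

# Positive pair: gradient of -log(sigmoid(u_pos . v)) w.r.t. u_pos and v.
g_pos = sigmoid(u_pos @ v) - 1.0           # error term (we want this score high)
grad_v = g_pos * u_pos
u_pos -= lr * g_pos * v                    # pull the true context vector toward v

# Negative samples: we want their scores with v to be low.
for i in range(len(u_negs)):
    g_neg = sigmoid(u_negs[i] @ v)         # error term for a negative word
    grad_v += g_neg * u_negs[i]
    u_negs[i] -= lr * g_neg * v            # push the negative context vector away from v

# Finally update the center word's main vector.
v -= lr * grad_v
```

Each step nudges the center word's main vector toward the true context word's vector and away from the sampled negative words' vectors; repeating this over many (center, context) pairs is all the gradient-descent weight update amounts to here.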

You're so good! Thank you very much for this explanation, my mind is so clear after it, magic!

nobodyelse

Awesome video! The numerical example was particularly helpful. Cheers :)

bharathtvadhoola

Wow, such a nice clear video! Many thanks!

dominicprior

Explained in a lucid manner. Nice video.

saikatroy

Thank you for the video! Great content!

zhiyili

Incredible and amazing explanation! Thanks so much for such great content!

vctorroferz

Great video! Thank you for explaining this so clearly.

junekang

This explanation is so well done! Thank you!

Recoils

Awesome explanation, you saved me a lot of time. Thank you! :)

MikeKittelberger

Another fantastic video! The main/context embeddings kind of confused me, but the ending really cleared it up. Curious to know if there is a deliberate way of choosing the number of dimensions or if it's simply trial and error. On a side note, will you be participating in the "66 days of data" challenge this July?

Halo-uznd