The Illustrated Word2vec - A Gentle Intro to Word Embeddings in Machine Learning

The concept of word embeddings is a central one in natural language processing (NLP). It's a method of representing words numerically -- as lists of numbers that capture their meaning. Word2vec is an algorithm (a couple of algorithms, actually) for creating word vectors that helped popularize this concept. In this video, Jay takes you on a guided tour of The Illustrated Word2vec, an article explaining the method and how it came to be developed.
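To make "lists of numbers that capture meaning" concrete, here is a minimal sketch with made-up toy vectors (real word2vec embeddings are learned from data and typically have 50-300 dimensions); similar words should point in similar directions, which cosine similarity measures:

```python
import math

# Hypothetical 4-dimensional "embeddings" for illustration only --
# real word2vec vectors are learned, not hand-written like these.
vectors = {
    "king":  [0.9, 0.8, 0.1, 0.3],
    "queen": [0.8, 0.9, 0.1, 0.4],
    "apple": [0.1, 0.0, 0.9, 0.7],
}

def cosine(a, b):
    """Cosine similarity: near 1.0 for similar directions, near 0 for unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

print(cosine(vectors["king"], vectors["queen"]))  # high: related words
print(cosine(vectors["king"], vectors["apple"]))  # lower: unrelated words
```

With real trained vectors, the same comparison is what powers "most similar word" lookups.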

Word2vec was introduced by Tomas Mikolov, Ilya Sutskever, Kai Chen, Greg Corrado, and Jeffrey Dean

---

More videos by Jay:

Language Processing with BERT: The 3 Minute Intro (Deep learning for NLP)

Explainable AI Cheat Sheet - Five Key Categories

The Narrated Transformer Language Model
Comments

I’ve watched a lot of videos on YouTube, so many with animations etc. I nearly lost hope, thinking I would never be able to grasp this concept. This is the only one that truly explains what a word embedding is and how it’s derived, in a simple manner. Thank you so much

debbs_io

Thank you for saying that bit about word2vec being outdated. A coworker was lobbying to use it for one of our projects and this helped nip that in the bud.

bazgo-odyj

One unsolicited piece of advice: you have profound knowledge of AI. You should share this knowledge by making more videos on various AI topics. I hope every AI aspirant gets a chance to watch your videos.
Keep it up..:)

priyam

Thanks for these videos and your blog, I've learned so much from you. I always read your blog entries before diving into the original paper.

andrestellez

Personality scores is a great example!

MannyBernabe

Great job! I very much enjoy your channel and blog! Thanks!

nelsonpullella

Finally found the video. If you haven't watched one yet, this is the one.

ArunNegi-fidi

This guy is the best. He is a good guy.

RoccoSwat

Thank you. Not related, but I really want to know: what font are you using in your blog posts?

ruikang

Thank you so much, it's a great explanation -- clear and easy to understand.

abdikadermohamed

Very good explanation. One more thing: does word2vec use dimensionality reduction too? We can choose 50, 100, or 200 dimensions, but how does that work? Thanks

lemoniall

Jay, how does training LLMs differ from training text embedding models? Or is an embedding model a byproduct of training an LLM, like in transformers where text is converted to embeddings first before being fed to the transformer blocks? Thanks!

bagamanocnon

3:32 "...Jay is 38 on the 0 to 100 scale... so -.4 on the -1 to 1 scale...": How is that? I get -.24. If it's -.4 on the -1 to 1 scale, that's 30 on the 0 to 100 scale. Please fix my math.
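The commenter's arithmetic checks out, assuming a standard linear map from [0, 100] to [-1, 1]: 38 maps to -0.24, and -0.4 corresponds to 30. A quick sketch (the function name is mine, for illustration):

```python
def rescale(x, lo=0.0, hi=100.0, new_lo=-1.0, new_hi=1.0):
    """Linearly map x from [lo, hi] onto [new_lo, new_hi]."""
    return new_lo + (x - lo) * (new_hi - new_lo) / (hi - lo)

print(rescale(38))  # -0.24, as the commenter computes
print(rescale(30))  # -0.4, the value quoted in the video
```

So the -.4 in the video appears to be an approximation (or a slip) rather than the exact linear rescaling of 38.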

sershsershsersh

Why does the person keep getting bigger and smaller throughout the video?

xudongguo-zu

Instead of explaining, you went scrolling through pages. It would have been better to keep it short and maybe make other videos for the subsequent sections.

Udayanverma