Understanding Word Embedding | What is Word Embedding | Word Embedding in Natural Language Processing

#WordEmbedding #WordEmbeddingInPython

Hello All,
This is Aman and I am a data scientist.

About this video:
In this video I explain word embeddings. The following questions are answered:
1. What is word embedding?
2. Why is word embedding needed?
3. What are the types of word embedding?
4. What is frequency-based word embedding?
5. What is prediction-based word embedding?
6. What are word2vec and GloVe?
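The frequency-based approach mentioned above can be sketched in a few lines (a minimal illustration, not code from the video; the toy corpus and the `embedding` helper are made up for this sketch): each word is represented by its co-occurrence counts with neighbouring words, so words that appear in similar contexts end up with similar vectors.

```python
# Minimal frequency-based word embedding: each word's vector is its
# co-occurrence count with every vocabulary word inside a +/-1 window.
corpus = ["i like apples", "i like nlp", "nlp uses embeddings"]  # toy corpus

vocab = sorted({w for sent in corpus for w in sent.split()})
index = {w: i for i, w in enumerate(vocab)}

# Build the |V| x |V| co-occurrence matrix.
cooc = [[0] * len(vocab) for _ in vocab]
for sent in corpus:
    words = sent.split()
    for i, w in enumerate(words):
        for j in range(max(0, i - 1), min(len(words), i + 2)):
            if j != i:
                cooc[index[w]][index[words[j]]] += 1

def embedding(word):
    """Return the co-occurrence row (a crude embedding) for a word."""
    return cooc[index[word]]
```

Prediction-based methods such as word2vec learn dense low-dimensional vectors instead of these sparse count rows, but the intuition is the same: a word is characterised by the company it keeps.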
About Unfold Data Science: This channel helps people understand the basics of data science through simple examples in an easy way. Anybody without prior knowledge of computer programming, statistics, machine learning, or artificial intelligence can gain a high-level understanding of data science through this channel. The videos uploaded are not very technical in nature, so they can be easily grasped by viewers from different backgrounds as well.

Join Facebook group:

Follow on Twitter: @unfoldds

Follow on Instagram: unfolddatascience

Watch the Python for data science playlist here:

Watch the statistics and mathematics playlist here:

Watch the end-to-end implementation of a simple machine learning model in Python here:

Learn about ensemble models, bagging and boosting here:

Access all my codes here:

Comments

I am highly excited for NN. You are here to really teach us, or else you would have started directly with word2vec. Thanks a lot Aman. The step-wise approach you have taken is the key to our understanding and development.🙂

anirbansarkar

Awesome,
Bro you explain really well.
Start a Neural Networks playlist and focus especially on different architectures like CNN, RNN, LSTM, GRU, transformers, etc.
It will be a great help to us

ShaidaMuhammad

This is the best explanation of GloVe so far. Thanks for the knowledge

deepankarkanand

Thanks a lot Aman... no unnecessary repetition... a perfect and simple explanation... very easy to understand... Day by day your channel will become famous for sure... Thanks...

yogeshbharadwaj

Always excited, especially by the way you explain the concepts....

SuperShiva

Highly excited... Please teach encoder-decoder, BERT, etc.

r

Thanks Aman, that was a good intro to the basics. Can you please elaborate on which topics we need to focus on and know in depth when it comes to NLP and neural networks? And when you create a playlist on NLP or neural networks, please cover each topic fully along with a practical implementation so that we can learn a lot

saravananbaburao

Excellent, Aman! :)
Start a Neural Networks playlist, beginning with architectures like CNN, RNN, LSTM

salehmohamed

God bless you, boy. Excellent explanation. A hard-to-find skill

asheeshmathur

You explain in a very easy way, brother. God bless you. Please create a series on deep learning too

junedansari

Have you explained the word2vec model?
If yes, then please give me the link

ShaidaMuhammad

Aman's video is informative, but your whiteboard should be bigger.

sunitabnsl

Let's say we have two sentences:
1. I like apples.
2. I like Apple's MacBook.

If I use word2vec embeddings, which preserve similar semantics, then both occurrences will get the same vector for the word.
So how do we handle the situation where the same word has a different meaning?

singhpishusingh

Hello Aman, can you provide your LinkedIn ID so that I can connect?

dhilipmaharish