Implementing Word Embedding Using Keras- NLP | Deep Learning

Word embeddings provide a dense representation of words and their relative meanings. They are an improvement over the sparse representations used in simpler bag-of-words models. Word embeddings can be learned from text data and reused across projects. They can also be learned as part of fitting a neural network on text data.
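To make the sparse-versus-dense contrast concrete, here is a minimal sketch in plain Python. The toy vocabulary and the embedding values are made up for illustration; in practice the dense vectors are learned from data.

```python
# Toy vocabulary (illustrative only)
vocab = ["king", "queen", "apple"]

# Sparse one-hot: one slot per vocabulary word; all vectors are orthogonal,
# so no notion of similarity between words is captured.
def one_hot(word):
    return [1 if w == word else 0 for w in vocab]

# Dense embedding: a small vector per word (values invented here for
# illustration); similar words can end up with similar vectors.
embedding = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.8, 0.9, 0.1],
    "apple": [0.1, 0.0, 0.9],
}

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

print(dot(one_hot("king"), one_hot("queen")))     # -> 0: one-hot carries no similarity
print(dot(embedding["king"], embedding["queen"])) # high: dense vectors can encode it
```

With one-hot vectors, "king" and "queen" are exactly as unrelated as "king" and "apple"; a learned dense representation can place related words close together.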

Please join my channel as a member to get additional benefits like Data Science materials, members-only live streams, and much more.

Please subscribe to my other channel too.

Connect with me here:

Comments

Steps To Follow


1. Sentences
2. One-hot representation -- index from the dictionary
3. One-hot representation ---> Embedding layer (Keras), to form the embedding matrix
4. Embedding matrix
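The steps above can be sketched with Keras. This is a minimal sketch: the sentences, the fixed sentence length of 8, and the 10-dimensional embedding are illustrative choices, and the one-hot/index lookup is spelled out with a plain dictionary rather than a Keras helper.

```python
import numpy as np
import tensorflow as tf

# Step 1: sentences (illustrative examples)
sentences = [["the", "glass", "of", "milk"],
             ["the", "cup", "of", "tea"]]

# Step 2: one-hot representation -- each word becomes its index from the
# dictionary (index 0 is reserved for padding)
vocab = {w: i + 1 for i, w in enumerate(sorted({w for s in sentences for w in s}))}
indexed = [[vocab[w] for w in s] for s in sentences]

# Pad every sentence to a fixed length so they can be batched together
sent_length = 8
padded = np.array([[0] * (sent_length - len(s)) + s for s in indexed])

# Steps 3-4: the Embedding layer maps each index to a dense 10-dim vector;
# its weight matrix of shape (vocab_size, 10) is the embedding matrix
emb = tf.keras.layers.Embedding(input_dim=len(vocab) + 1, output_dim=10)
vectors = emb(padded)
print(vectors.shape)  # (2, 8, 10): 2 sentences, 8 positions, 10 features each
```

Each integer index simply selects a row of the embedding matrix; those rows start out random and are adjusted during training when the layer is part of a larger model.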

krishnaik

I cannot thank you enough for this particular video. The length to which you have gone to explain word embeddings is highly appreciated. A world of thanks.

ijeffking

Thank you so much, Krish Sir, for this wonderful playlist! Learned a lot!

gauravsahani

Thank you so much. I was having trouble understanding embedding which I need to implement for a model in one of my classes but you have made it very clear and easy to understand.

kfwvvep

Your video helped me a lot to understand it and to start working as a beginner.

spartacuspolok

A better explanation than the Stanford CS224N: NLP with Deep Learning (Winter 2019) course. Thank you, sir.

googlecolab

Thank you, it was a great explanation!

maralazizi

Awesome explanation, Krish, hats off... thanks a ton.

debashisghosh

Thanks, bro, for putting such great effort into teaching.

vivekbhat

Krish, this is a wonderful explanation. I just wanted to know: I have watched your previous three videos on NLP and I want to learn this technique from scratch. Is that enough, or are there other topics to cover?

krishnaprasad-unhy

It makes sense and is simple to understand, thanks bro.

affandibenardi

You and Codebasics are the two eyes of teaching. There are so many doctors, but only a few can give an injection without pain.

shaiksuleman

Your videos are really great, sir. Hats off to you. Please also make a video on sentence embedding techniques like InferSent.

rishikeshthakur

Krish, can you explain some applications of NLP using LSTMs, like next-word prediction, translation, and image captioning?

RitikSingh-ubkc

Can't you do the one-hot representation with TensorFlow's 'tokenizer and sequence' functions?

theniyal

Hi Krish,
It was a great video. I am a beginner, and I started looking at some LSTM projects on Kaggle. I always had one doubt: when to use a specific layer. In some projects they use two LSTMs, or dropout with a certain value, and these choices differ from project to project, so I get confused about how they were made. I would request you to make a video on how we can know when to use a certain layer and why.

cool_videos

Better than Applied AI... really the best video.

Gamezone-kqsx

Hi sir, please suggest a face recognition CNN model that is comparable with mobile face recognition.

radhikapatil

Thanks sir, I have one doubt about this: what is the benefit of this word representation? Is it that we can predict sentiment, or are there other advantages?

ahmedosama

Thanks for the video. When we add the embedding, we need to set the feature size; here we set it to 10. So how does Keras know which 10 features need to be selected?

mohamednajiaboo