Word embedding using keras embedding layer | Deep Learning Tutorial 40 (Tensorflow, Keras & Python)

In this video we will discuss how exactly word embeddings are computed. There are two techniques for this: (1) supervised learning and (2) self-supervised learning techniques such as word2vec and GloVe. In this tutorial we will look at the first technique, supervised learning. We will also write code for food review classification and see how word embeddings are calculated while solving that problem.
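To make the idea concrete, here is a minimal sketch of the kind of supervised pipeline described above. The tiny review dataset, layer sizes, and variable names are illustrative assumptions rather than the video's exact code, and it assumes TensorFlow 2.x with the legacy tf.keras.preprocessing utilities:

```python
import numpy as np
from tensorflow.keras import Input
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, Flatten, Dense
from tensorflow.keras.preprocessing.text import one_hot
from tensorflow.keras.preprocessing.sequence import pad_sequences

# Tiny illustrative food-review dataset: 1 = positive, 0 = negative.
reviews = ["nice food", "amazing restaurant", "too good",
           "horrible service", "needs improvement", "never going back"]
labels = np.array([1, 1, 1, 0, 0, 0])

vocab_size = 50   # words are hashed into integer ids in [1, vocab_size)
max_length = 3    # every review is padded/truncated to 3 tokens
embed_dim = 4     # size of each learned word vector

# Note: one_hot here is a hashing trick returning integer ids, not a one-hot matrix.
encoded = [one_hot(r, vocab_size) for r in reviews]
padded = pad_sequences(encoded, maxlen=max_length, padding="post")

model = Sequential([
    Input(shape=(max_length,), dtype="int32"),
    Embedding(vocab_size, embed_dim, name="embedding"),  # the layer that learns word vectors
    Flatten(),
    Dense(1, activation="sigmoid"),                      # supervised target: review sentiment
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(padded, labels, epochs=50, verbose=0)

# The word embeddings are simply the trained weights of the Embedding layer.
word_vectors = model.get_layer("embedding").get_weights()[0]
print(word_vectors.shape)  # (vocab_size, embed_dim)
```

The key point is that the embedding vectors are just another set of weights, learned as a side effect of training the classifier on the sentiment labels.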

🔖 Hashtags 🔖
#WordEmbeddingUsingKeras #WordEmbedding #EmbeddingLayerKeras #WordEmbeddingdeeplearning #WordembeddingswithKeras #wordembeddinginpython #wordembeddingpython #wordembeddingtensorflow

#️⃣ Social Media #️⃣

❗❗ DISCLAIMER: All opinions expressed in this video are my own and not those of my employer.
Comments

I have to say you are an amazing mentor. Your tutorials give me so much insight. Ten minutes ago I knew nothing about embedding tables, but now I understand them clearly.

changqi

Thank you so much for sharing! I'm just starting out with TensorFlow, and you have saved me a lot of time. Please keep sharing :)

alifia

It's definitely the best video on YouTube for learning word embeddings.

JIUSIZHENG

Thank you very much for this and the previous videos. You explain embeddings very clearly.

tmorid

Thanks a lot for all the content. Your explanation is really awesome.

ojsrktf

Thank you for such a nice video; it was very informative and easy to understand. Keep it up!

prernasingh

Very awesome and easy-to-understand video... thanks, mate!

jerkmeo

Amazing, Dhaval... it gave me a very clear idea.

koushik

Hi, thank you so much for this wonderful set of videos. Could you kindly upload something about the TimeDistributed layer and what exactly it does?

djs

Thank you so much for this wonderful video

himanshusirsat

Thank you so much for this tutorial. It taught me a lot that I really needed to know! I'm subscribing. I hope you'll continue to make more in-depth videos about TensorFlow and all machine learning topics!

TheVerbalAxiom

Sir, amazing explanation. Please make a video on GloVe. Thank you.

hardikvegad

That was a very valuable tutorial; thank you very much, Sir!

jyotikokate

You're amazing, keep doing what you're doing. Also, do you have any reinforcement learning theory videos?

marcusrose

Another amazing video! Thank you so much. How can I use this model to predict Y for new review comments that contain more or different words than the training dataset?

sergiochavezlazo
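A hedged sketch addressing the question above, reusing the model, vocab_size, and max_length names from the earlier sketch under the description (assumptions, not the video's code): with the one_hot hashing approach, a new or unseen word is simply hashed into the same index range, so it borrows whatever vector happens to sit at that index rather than getting a meaningfully trained one. New reviews are preprocessed exactly like the training data and passed to model.predict.

```python
from tensorflow.keras.preprocessing.text import one_hot
from tensorflow.keras.preprocessing.sequence import pad_sequences

# New, unseen reviews; words not seen in training still hash into [1, vocab_size).
new_reviews = ["absolutely fantastic meal", "terrible and overpriced"]
encoded_new = [one_hot(r, vocab_size) for r in new_reviews]
padded_new = pad_sequences(encoded_new, maxlen=max_length, padding="post")

probs = model.predict(padded_new)          # sigmoid outputs in [0, 1]
predictions = (probs > 0.5).astype(int)    # 1 = positive review, 0 = negative
```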

Great videos, Sir, very informative. Could you please add the next videos and complete the playlist a little sooner? Many academic people are studying from this playlist. Thank you for making such videos. Eagerly waiting for word2vec and the BERT model.

adityapradhan

It is a great tutorial, thanks. You mentioned that you would paste the link to the Jason Brownlee article, but it is missing.

regivm

Great videos, keep it up. I was wondering how we create user or item embeddings, say, to calculate the similarity between two users in order to recommend a certain product?

mohammadkareem

Great video! Thanks! How do we choose which words will be in our vocabulary? For example, if our vocabulary size is 5000, do we choose the 5000 most frequent words?

randb
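Regarding the vocabulary question above, a hedged sketch of one common approach, frequency-based selection with Keras' Tokenizer; this is an assumption about standard practice rather than something shown in the video, and it replaces the one_hot hashing used earlier. The num_words argument keeps only the most frequent words when converting texts to sequences; rarer words map to an out-of-vocabulary token.

```python
from tensorflow.keras.preprocessing.text import Tokenizer

corpus = ["nice food", "amazing restaurant", "horrible service"]  # illustrative stand-in

# num_words keeps only the top (num_words - 1) most frequent words when
# converting texts to sequences; rarer words become the <OOV> token.
tokenizer = Tokenizer(num_words=5000, oov_token="<OOV>")
tokenizer.fit_on_texts(corpus)

sequences = tokenizer.texts_to_sequences(corpus)
print(tokenizer.word_index)  # word -> index map (index 1 is <OOV>, then words by frequency)
```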

Please add more data science, machine learning, and deep learning projects, from beginner to advanced level.

MLDSInsights