Long Short-Term Memory for NLP (NLP Zero to Hero - Part 5)

Welcome to episode 5 of our Natural Language Processing with TensorFlow series. In this video we look at how to handle context in language across longer sentences, where a word early in the sentence can determine the meaning and semantics of the end of the sentence. We'll use something called an LSTM, or Long Short-Term Memory, to achieve this.
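As a rough illustration of the idea described above (not the video's exact code), a Keras model for this kind of task typically embeds word tokens and passes them through a bidirectional LSTM so that context from earlier words can influence the interpretation of later ones. The vocabulary size, embedding dimension, and layer widths below are placeholder assumptions:

```python
import tensorflow as tf

VOCAB_SIZE = 1000  # assumed tokenizer vocabulary size
EMBED_DIM = 16     # assumed embedding dimension

model = tf.keras.Sequential([
    # Map integer word tokens to dense vectors
    tf.keras.layers.Embedding(VOCAB_SIZE, EMBED_DIM),
    # Bidirectional LSTM carries context forwards and backwards
    # across the whole sentence
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(32)),
    tf.keras.layers.Dense(24, activation="relu"),
    # e.g. a binary sentiment output
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(loss="binary_crossentropy", optimizer="adam",
              metrics=["accuracy"])
```

The model would then be trained with `model.fit` on padded token sequences and their labels.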

Comments

I love your simple & crystal clear explanation which anyone can understand easily, Laurence

multiuniverse

This channel is more about getting familiar with the terminology than anything else. I came here after I'd watched the video and finished the related Coursera course (the TensorFlow specialization), and then came back here again. It helps as an icebreaker for the topic. Something Laurence is gifted at!

saeeduchiha

What should I do for multi class text classification?

tusharys

I really enjoyed the fluency of the presentation and how you make it sound easy to understand. I have a question for you: how can I use word embeddings of a text column in my data, along with a set of numerical and categorical features, to fit a bidirectional LSTM? I have one text column in my data, and the rest of the features are a mixture of numerical and categorical.

saberfallahpour

Hello Laurence,
I am a student in the TensorFlow in Practice specialization. I want to thank you for all the effort you and Andrew are putting into making AI and coding with the TensorFlow framework a commonplace skill.

I am in the 2nd course of the specialization and have a good feeling about my progress. I was thinking of taking the TensorFlow certificate exam, but I have a few doubts about whether this specialization alone will be enough for test preparation.

So far the course has covered only tf.keras, but TensorFlow has lots of other modules as well, like tf.data, tf.saved_model, tf.estimator, tf.errors, etc.
What about them? Will the test include material from those modules?

Will we see you applying and explaining those modules in any future courses?

Lastly, can I help you in any way with the activities you are doing?

kk-ypme

This is so much fun! I really hope you'll teach time series and forecasting too, please!

HealthyFoodBae_

Sir, can we use TensorFlow for unsupervised learning, like text clustering using word embeddings?

chandansingh-ddfj

When you stack multiple LSTM layers, shouldn't one make sure the activation functions are non-linear so that it has a positive effect on model capacity? I didn't see that in the example shown in this video.
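Regarding the question above: a sketch of how stacked LSTM layers typically look in Keras, along the lines of the example in the video. Note that the Keras `LSTM` layer's defaults are already non-linear (a `tanh` output activation and `sigmoid` gate activations), which may be why no extra activation appears in the example; all layer sizes here are placeholder assumptions:

```python
import tensorflow as tf

stacked = tf.keras.Sequential([
    tf.keras.layers.Embedding(1000, 16),  # assumed vocab and embedding size
    # Every LSTM except the last must return the full sequence so the
    # next LSTM receives one vector per timestep
    tf.keras.layers.Bidirectional(
        tf.keras.layers.LSTM(32, return_sequences=True)),
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(16)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
```

Forgetting `return_sequences=True` on the first LSTM is a common source of shape errors when stacking recurrent layers.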

followiamgucci

Not totally on topic, but are there public datasets available for NLP in domains such as finance (the stock market)? Sentiment analysis seems to be trending, but you need a labeled dataset of financial news, Twitter posts, etc. Without any labelling, you are left to use some kind of unsupervised learning alone, which might affect your overall structure.

secretsecret

Sir, why did TensorFlow discontinue its GPU support on Windows? I really wish they'd bring it back.

willgordon

How do I implement a CNN+BiLSTM for an NLP project?

shahidulislamzahid

Hey, if we believe in you, I think there's no goal we can't achieve, thanks to this channel specifically.

saiabhi