167 - Text prediction using LSTM (English text)

LSTMs are great for timeseries forecasting and sequence predictions. This makes them appropriate for natural language prediction. This tutorial explains the process of training an LSTM network on English text and predicting letters based on the training.
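The character-level pipeline the tutorial describes can be sketched as follows; the sample text, `seq_length`, and variable names are illustrative assumptions, not the video's exact code:

```python
import numpy as np

text = "hello world, this is a tiny sample of english text."
chars = sorted(set(text))                  # vocabulary of unique characters
char_to_int = {c: i for i, c in enumerate(chars)}
int_to_char = {i: c for i, c in enumerate(chars)}

seq_length = 10                            # characters of context per sample
X, y = [], []
for i in range(len(text) - seq_length):
    window = text[i:i + seq_length]        # input: a sliding window of characters
    target = text[i + seq_length]          # output: the character that follows it
    X.append([char_to_int[c] for c in window])
    y.append(char_to_int[target])

X = np.array(X)                            # shape: (num_samples, seq_length)
y = np.array(y)                            # shape: (num_samples,)
```

An LSTM is then trained to map each window `X[i]` to the next-character label `y[i]`; generation repeats predict-then-append one character at a time.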

Code generated in the video can be downloaded from here:
Comments

Thank you very much!!
The most underrated channel on YT

pranavkushare

Wow, I have been watching videos on language modeling and LSTMs; yours is the best explanation so far. Thank you, sir.

mr.n.v.subbareddy

Great video!! I have seen other tutorials with the char_to_int and int_to_char code but yours was the best explanation by far. Keep up the wonderful work. Very helpful stuff.

pmiller

23:32 You don't need to recompute the softmax in your prediction; the output activation is already softmax, so the probabilities sum to 1. Just take the argmax of the prediction to get the corresponding character from your dictionary.
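The commenter's point can be checked numerically: a softmax output row already sums to 1, so argmax alone recovers the predicted character (the probabilities below are made-up values, not real model output):

```python
import numpy as np

int_to_char = {0: "a", 1: "b", 2: "c"}

# Pretend this is one row of model.predict() from a softmax output layer:
probs = np.array([0.1, 0.7, 0.2])

next_index = int(np.argmax(probs))     # most likely class; no re-normalizing needed
next_char = int_to_char[next_index]    # -> "b"
```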

WahranRai

Thank you. Looking forward to the videos about transformer!

leamon

Can you demonstrate how this model can be implemented in a normal front-end text box made with Flask...
like we write something and it renders some autocompleted text.

varshikjain

Thank you very much! Thanks to you, I was able to train a network with discord logs to create a bot that talks like my friend.

OsscarBones

Thank you a lot, sir!
Your explanation is clear.
I tried the process and have a question: for the same model and the same seed, why do I get different predictions when I run it again?
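A likely explanation (an assumption, since the questioner's generation code isn't shown): if the next character is *sampled* from the softmax probabilities rather than taken with argmax, every run draws differently unless the random number generator is seeded. A minimal NumPy sketch:

```python
import numpy as np

probs = np.array([0.1, 0.7, 0.2])       # hypothetical softmax output for 3 characters

def sample_next(probs, seed=None):
    """Draw a character index from the probability distribution."""
    rng = np.random.default_rng(seed)   # fixed seed -> reproducible draw
    return int(rng.choice(len(probs), p=probs))

a = sample_next(probs, seed=42)
b = sample_next(probs, seed=42)
assert a == b                           # same seed, same prediction
```

With plain argmax instead of sampling, the output is deterministic regardless of seeding.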

yhi

Hello sir, can you please help in solving the error at prediction... not able to convert int to char; this line gives errors: next_char = int_to_char[next_index] is not working
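A frequent cause of that kind of failure (a guess, since the full traceback isn't shown) is that `int_to_char` was built from a different text than the one used at prediction time, so the predicted index is missing from the dictionary. A defensive lookup makes the mismatch explicit:

```python
chars = sorted(set("some training text"))            # vocabulary from the training corpus
int_to_char = {i: c for i, c in enumerate(chars)}    # index -> character

next_index = 1                                       # pretend this came from np.argmax(prediction)
next_char = int_to_char.get(next_index)
if next_char is None:
    raise KeyError(f"index {next_index} is outside the vocabulary "
                   f"(size {len(int_to_char)}); rebuild char_to_int/int_to_char "
                   "from the same text used for training")
```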

sonalichakrabarty

Thank you,
could you please suggest the best approach for handwritten character recognition?

akshathamanjunath

Thank you very much; it gives me a hint for my project. I am trying to do project work on "Automatic Question Generation using LSTM". Do you have any project work related to this?

waleligntewabe

Can you please advise on this... Getting an error here -->
model.add(LSTM(128, input_shape=(seq_length, n_vocab)))

NotImplementedError: Cannot convert a symbolic Tensor (SampleLSTM/strided_slice:0) to a numpy array. This error may indicate that you're trying to pass a Tensor to a NumPy call, which is not supported

rvsivabalan

This was a great video. Can anyone guide me to a full implementation of data-to-text generation using NLG?

sameerasalih

Can I train the characters using a CNN, make a database and keep it, and then in the testing phase, can I input a word and segment the characters based on the trained input characters?

akshathamanjunath

Do we have a transformer network tutorial so far?

guohanzhao

Thanks a lot... You are amazing... Could you please add evaluation metrics (perplexity and BLEU score) for text generation?

thelastone

Could you please make a video on text-line image recognition using LSTM, i.e. recognizing a line of text or words from images, and how to prepare the dataset (ground truth and vectorizing the images)?

efremyohannes

Sir, can I try transformers for multivariate time series analysis?

rathnakumarv

Hi sir
Sir, we are stuck on text generation. We trained our model for 70 epochs with batch size 150.
But when we give a predefined sentence, it just creates symbols like:
";;$?ha j +#?
Gan j@($!")#?
While our code is the same as what you did. What's wrong with it?

peshawriankhan

Please make a video on UNet++ and Attention UNet; no one can make it as simple to understand as you.

aakashverma