PyTorch Tutorial - RNN & LSTM & GRU - Recurrent Neural Nets

Implement a Recurrent Neural Net (RNN) in PyTorch! Learn how to use the nn.RNN module with an input sequence. I also show you how easily you can switch to a gated recurrent unit (GRU) or long short-term memory (LSTM) RNN.
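The pattern from the video can be sketched as follows. The sizes below are illustrative placeholders, not the values used in the video; the key point is that nn.RNN, nn.GRU, and nn.LSTM share almost the same call signature, so swapping between them is a one- or two-line change.

```python
import torch
import torch.nn as nn

# Illustrative sizes (not from the video)
batch_size, seq_len, input_size, hidden_size, num_layers = 4, 28, 28, 128, 2

# batch_first=True means inputs are shaped (batch, seq, feature)
rnn = nn.RNN(input_size, hidden_size, num_layers, batch_first=True)

x = torch.randn(batch_size, seq_len, input_size)
h0 = torch.zeros(num_layers, batch_size, hidden_size)

out, hn = rnn(x, h0)
# out: output at every time step -> (batch, seq_len, hidden_size)
# hn:  final hidden state        -> (num_layers, batch, hidden_size)

# Switching to a GRU is a drop-in change; same signature:
gru = nn.GRU(input_size, hidden_size, num_layers, batch_first=True)
out, hn = gru(x, h0)

# An LSTM additionally carries a cell state, passed as a tuple (h0, c0):
lstm = nn.LSTM(input_size, hidden_size, num_layers, batch_first=True)
c0 = torch.zeros(num_layers, batch_size, hidden_size)
out, (hn, cn) = lstm(x, (h0, c0))
```

For a many-to-one classifier, the last time step `out[:, -1, :]` is typically fed into a linear layer.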

~~~~~~~~~~~~~~ GREAT PLUGINS FOR YOUR CODE EDITOR ~~~~~~~~~~~~~~

🚀🚀 JOIN MY NEWSLETTER 🚀🚀

🚀🚀 Get exclusive content on Patreon: 🚀🚀

If you enjoyed this video, please subscribe to the channel!

Code:

PyTorch Beginner Course:

Beginner Course Code:

PyTorch Tutorial 13 - Feed-Forward Neural Network:

RNN:

LSTM:

GRU:

Further Readings:

You can find me here:

#PyTorch

----------------------------------------------------------------------------------------------------------
* This is a sponsored link. By clicking on it you will not have any additional costs, instead you will support me and my project. Thank you so much for the support! 🙏
Comments

The more code you explain, the more I love this channel. Just amazing. Keep it up.

teetanrobotics

Man.. You are really good at explaining.. Finally understood the RNN, LSTM and GRU implementation from your video and the official documentation.

rigeltal

This video gives me a very clear picture of implementing RNN with Pytorch. I really appreciate it!

yijiesun

I love you. Very clear explanation. I have been looking for this content for a while

phatarapransaraluck

Thanks so much! I watched through all of your pytorch tutorial, and it is the best pytorch tutorial on youtube!!

jinyoungchoi

Great tutorial. Thanks! Keep doing your job.
I would appreciate it even more if you could add a small part to the video explaining how to implement the "many to many" case too.

hpaghababa
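A hedged sketch of the "many to many" case asked about above: instead of classifying only the last time step, the linear layer is applied to the output at every time step (nn.Linear operates on the last dimension, so no loop is needed). The class and parameter names here are hypothetical, not from the video.

```python
import torch
import torch.nn as nn

class ManyToManyRNN(nn.Module):
    """Hypothetical many-to-many variant: one prediction per time step."""
    def __init__(self, input_size, hidden_size, num_classes, num_layers=1):
        super().__init__()
        self.rnn = nn.GRU(input_size, hidden_size, num_layers, batch_first=True)
        self.fc = nn.Linear(hidden_size, num_classes)

    def forward(self, x):
        out, _ = self.rnn(x)   # (batch, seq_len, hidden_size)
        return self.fc(out)    # linear applied at every step -> (batch, seq_len, num_classes)

model = ManyToManyRNN(input_size=10, hidden_size=32, num_classes=5)
y = model(torch.randn(2, 7, 10))  # shape: (2, 7, 5)
```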

I spent an hour wandering the internet struggling with mismatched LSTM dimensions, and this video cleared things up. Great.

vowvfiy

Your tutorials are incredible! thank you so much!

HeyImAK

Very nice tutorial for Pytorch. Thanks for the initial code.

MrAstor

I was waiting for you to upload some videos on deep learning ...thanks so much !!!

dipanwitamallick

i am sure you will get 10K very soon and 100K and then 1M :) keep the vibe dude

username

Great work. Any best-practice ideas for figuring out the relevant tensor shapes at various steps? That is challenging to me. Thanks

DanielWeikert

Nice video Patrick and big congratulations on 10k subscribers, there's been a lot of hard work for you to get to that point! I'm sure the best is yet to come too :) I know we make quite similar videos, but I am very happy that is the case because it drives me to make better videos and I learn a lot from you as well 👊 Also, the more people doing videos about PyTorch, TensorFlow and machine learning in general, the better it will be for people wanting to learn about these things, which is ultimately the goal.

AladdinPersson

From freeCodeCamp!! Thanks for your content man!

gradientO

Do you have a tutorial on hyperparameters for RNNs? That would be great!!

zhengguanwang

you save my life, best wishes for you ^_^

jieluo

Good video, but wouldn't the classification be better if you'd connect all the outputs of the recurrent layer to the classes via the linear layer?

cviotzv

Hi, how can we use the pygad lib with PyTorch? Especially for optimization of RNNs.

cedar

Your videos are really helpful. For some weird reason, the torchsummary summary states that there are no learnable parameters in my RNN layers... that must be a bug, or am I doing it wrong? XD

randomforrest

Thank you for your great videos.
15:02 Little question: aren't we supposed to initialize the cell_state and the hidden_state (t-1) at each epoch instead of at each lstm_cell (inner loop)? Otherwise the cell_state, which is supposed to play the memory role, would be useless...
Thanks!

nasser-eddinemonir
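On the question above about initializing the cell state: with nn.LSTM, the module unrolls all time steps internally, so a fresh (h0, c0) passed once per forward call is reset per batch of sequences, not per cell step; the cell state still persists across the whole sequence and keeps its memory role. A minimal sketch of that pattern, with hypothetical names and a many-to-one head as an assumption:

```python
import torch
import torch.nn as nn

class LSTMClassifier(nn.Module):
    """Sketch: h0/c0 are created fresh in forward(), i.e. once per batch.
    Memory is kept across time steps (inside self.lstm) but not across batches."""
    def __init__(self, input_size, hidden_size, num_classes, num_layers=1):
        super().__init__()
        self.hidden_size = hidden_size
        self.num_layers = num_layers
        self.lstm = nn.LSTM(input_size, hidden_size, num_layers, batch_first=True)
        self.fc = nn.Linear(hidden_size, num_classes)

    def forward(self, x):
        # Zero states for this batch; the LSTM carries them through every time step.
        h0 = torch.zeros(self.num_layers, x.size(0), self.hidden_size, device=x.device)
        c0 = torch.zeros(self.num_layers, x.size(0), self.hidden_size, device=x.device)
        out, _ = self.lstm(x, (h0, c0))   # out: (batch, seq_len, hidden_size)
        return self.fc(out[:, -1, :])     # many-to-one: classify the last time step

model = LSTMClassifier(input_size=28, hidden_size=64, num_classes=10)
logits = model(torch.randn(3, 28, 28))  # shape: (3, 10)
```

Stateful training (carrying the hidden state across batches) would instead detach and reuse the returned states, but that is a different setup from the one in the video.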