LSTM and GRU | Long short-term memory and Gated Recurrent Unit

LSTM = Long short term memory
GRU = Gated Recurrent Unit
A plain RNN may fail to capture long-term dependencies. LSTM and GRU come into the picture to solve this.
In an LSTM we have:
1. Forget gate
2. Input gate
3. Cell state
4. Output gate
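As an illustrative sketch (not the code from the video), the four LSTM components above can be written as one NumPy time step. The layout here, with all four gate blocks stacked in a single weight matrix `W`, is an assumption made for compactness:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM time step.
    W has shape (4*hidden, input+hidden): forget, input, candidate,
    and output blocks stacked top to bottom. b has shape (4*hidden,)."""
    n = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b
    f = sigmoid(z[0:n])          # forget gate: what to drop from c_prev
    i = sigmoid(z[n:2*n])        # input gate: what new info to write
    g = np.tanh(z[2*n:3*n])      # candidate values for the cell state
    o = sigmoid(z[3*n:4*n])      # output gate: what to expose as h
    c = f * c_prev + i * g       # cell state (the long-term memory)
    h = o * np.tanh(c)           # hidden state (the output)
    return h, c

# toy usage with random weights
rng = np.random.default_rng(0)
n_in, n_hid = 3, 4
W = rng.standard_normal((4 * n_hid, n_in + n_hid))
b = np.zeros(4 * n_hid)
h, c = lstm_step(rng.standard_normal(n_in),
                 np.zeros(n_hid), np.zeros(n_hid), W, b)
print(h.shape, c.shape)  # → (4,) (4,)
```

The key point of the sketch: the cell state `c` flows through with only elementwise gating, which is what lets gradients survive over long sequences.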

In a GRU we have:
1. Update gate
2. Reset gate
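For comparison, here is a hedged sketch of one GRU time step in the same NumPy style; the two-block weight layout is again an assumption, not the video's code. Note the GRU has no separate cell state, which is why it needs fewer parameters than an LSTM:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h_prev, W_gates, b_gates, W_cand, b_cand):
    """One GRU time step.
    W_gates: (2*hidden, input+hidden), update and reset blocks stacked.
    W_cand:  (hidden, input+hidden) for the candidate state."""
    n = h_prev.shape[0]
    zr = sigmoid(W_gates @ np.concatenate([x, h_prev]) + b_gates)
    z = zr[:n]   # update gate: how much of h_prev to overwrite
    r = zr[n:]   # reset gate: how much of the past feeds the candidate
    h_tilde = np.tanh(W_cand @ np.concatenate([x, r * h_prev]) + b_cand)
    return (1.0 - z) * h_prev + z * h_tilde  # new hidden state

# toy usage with random weights
rng = np.random.default_rng(1)
n_in, n_hid = 3, 4
W_gates = rng.standard_normal((2 * n_hid, n_in + n_hid))
W_cand = rng.standard_normal((n_hid, n_in + n_hid))
h = gru_step(rng.standard_normal(n_in), np.zeros(n_hid),
             W_gates, np.zeros(2 * n_hid), W_cand, np.zeros(n_hid))
print(h.shape)  # → (4,)
```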
Both LSTM and GRU are powerful enough to handle tasks with long-term dependencies.
We have to choose between LSTM and GRU on the basis of our project.

Click on this link to subscribe to my channel:

Searching for more videos? Here are the links:
🢡Complete Natural Language Processing playlist:

🢡Deep learning tutorials:

🢡Machine learning and data science tutorials:

🢡Machine learning and data science projects:

🢡Python full course:

🢡Java full course:

🢡Project discussion: what projects can we make with Python, PHP, Java, C/C++:

🢡Learn in just one video (C, C++, Java, PHP, Python, and many more)

➤Visit my channel to find more videos:

#LSTM
#GRU
#NLP

Downloading the videos, code, or files from this channel and uploading them to another channel will be met with the strictest possible action.
Uploading my videos to another channel or misusing my files or code is strictly prohibited. We have no concern regarding embedding the videos on other sites. Keep loving, keep supporting!