What is LSTM (Long Short Term Memory)?

Long Short Term Memory networks, also known as LSTMs, are a special kind of Recurrent Neural Network (RNN) architecture capable of learning long-term dependencies; they also address the vanishing gradient problem that can occur when training traditional RNNs.

In this lightboard video, Martin Keen of IBM breaks down why we need LSTMs to address the problem of long-term dependencies, how the cell state and its various gates help carry relevant information through a sequence, and a few key LSTM use cases.
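As a rough illustration of the gating mechanism the video describes, here is a minimal single-step LSTM cell in plain NumPy. The weight layout, variable names, and toy usage at the end are assumptions made for this sketch, not the exact formulation from the video.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM time step (illustrative sketch).

    x      : input vector at this time step, shape (input_size,)
    h_prev : previous hidden state, shape (hidden_size,)
    c_prev : previous cell state, shape (hidden_size,)
    W      : stacked gate weights, shape (4 * hidden_size, input_size + hidden_size)
    b      : stacked gate biases, shape (4 * hidden_size,)
    """
    hidden_size = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b

    f = sigmoid(z[0 * hidden_size:1 * hidden_size])  # forget gate: what to drop from the cell state
    i = sigmoid(z[1 * hidden_size:2 * hidden_size])  # input gate: how much new information to admit
    g = np.tanh(z[2 * hidden_size:3 * hidden_size])  # candidate values to add to the cell state
    o = sigmoid(z[3 * hidden_size:4 * hidden_size])  # output gate: what to expose as the hidden state

    c = f * c_prev + i * g   # updated cell state (the long-term "conveyor belt")
    h = o * np.tanh(c)       # updated hidden state (short-term output)
    return h, c

# Toy usage: run a short sequence through the cell with random weights.
rng = np.random.default_rng(0)
input_size, hidden_size = 3, 4
W = rng.standard_normal((4 * hidden_size, input_size + hidden_size)) * 0.1
b = np.zeros(4 * hidden_size)
h = np.zeros(hidden_size)
c = np.zeros(hidden_size)
for x in rng.standard_normal((5, input_size)):  # sequence of 5 time steps
    h, c = lstm_step(x, h, c, W, b)
print(h)
```

Production frameworks such as PyTorch (torch.nn.LSTM) and Keras (tf.keras.layers.LSTM) implement the same forget/input/output gate structure, with batching, multiple layers, and optimized kernels on top.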

#LSTM #RNN #AI
Comments

Martin you are a wonderful teacher! Thank you very much for the explanations.

simonbax

I'm in love with his way of teaching!

amalkumar

So are we going to ignore the fact that he wrote everything backwards on a clear glass wall?

channel-xjrp

Very helpful lecture. Keep up the good work!

DieLazergurken

Very useful and helpful. This is a hard topic to understand readily, but you did it in just 8 minutes. Thanks for that, Mr. Martin and company. Greetings from Mexico!

engr.inigo.silva

After watching Martin Keen's explanations, you don't need another short explanation

IsxaaqAcademy

Great advancement in time. Glad to have a better understanding. Thank you folks

toenytv

I'd appreciate a practical video. For instance, a medical longitudinal study with N patients and several visits over time, where they receive a certain medication. What would the input need to look like? What is the process?

medical-informatics
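One possible way to frame the longitudinal-study question above (a sketch only, with invented names, shapes, and random data): represent each patient as a sequence of visits, so the input is a 3-D tensor of shape (patients, visits, features per visit), with shorter histories zero-padded, and let the LSTM's final hidden state feed a small prediction head. The hypothetical OutcomeModel below uses PyTorch purely for illustration.

```python
import torch
import torch.nn as nn

# Hypothetical setup: 32 patients, up to 6 visits each, 8 measurements per visit
# (e.g. lab values, dose, vitals); shorter visit histories are zero-padded.
n_patients, max_visits, n_features = 32, 6, 8
visits = torch.randn(n_patients, max_visits, n_features)  # shape: (batch, time, features)

class OutcomeModel(nn.Module):
    """Sketch: encode each patient's visit sequence and predict one outcome."""
    def __init__(self, n_features, hidden_size=16):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)  # e.g. probability of response to the medication

    def forward(self, x):
        _, (h_last, _) = self.lstm(x)             # h_last: final hidden state per patient
        return torch.sigmoid(self.head(h_last[-1]))  # one prediction per patient

model = OutcomeModel(n_features)
predictions = model(visits)  # shape: (n_patients, 1)
print(predictions.shape)
```

In practice one would also pass per-patient sequence lengths (e.g. via torch.nn.utils.rnn.pack_padded_sequence) so that the zero-padding does not influence the final hidden state.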

Good lecture! Please make a video on "transformer based models" too.

It would be very helpful.

akashthoriya

Fantastic explanation. Please keep making more.

vivekpujaravp

Wait, the Homebrew Challenge dude does Deep Learning too?!

ashleygillman

Imagine if the model takes the context "My name " and predicts "J" as the next letter 🤣

willdrunkenstein

Such an informative lecture, thank you so much.

waleedt_trz

Thank you Martin and team. Great work.

RishabKapadia

Good lecture! Thank you very much for the explanations.

WangY-ipsb

Sir, can you do a video of an RNN example with numerical values?

jayasreechaganti

Many thanks, your lecture is very helpful. Could you please explain all of the LSTM gates (Forget Gate, Learn Gate, Remember Gate & Use Gate (Output))?

balenkamal

Anyone else distracted by the fact that he's writing backwards? Great vids, keep it up

TaylorDeiaco

Hi Martin, regarding your example of Martin and Jennifer: when questions are asked about Jennifer, Martin is no longer relevant, so he'll be forgotten, right? If at some point in the future a question is asked about Martin, is that still relevant to the LSTM? I mean, will it be able to recall Martin even after forgetting him?

wqsnzbd

Are there any new algorithms that work more powerfully and more efficiently than traditional neural networks?

sathiraful