Neural Transformer Encoders for Timeseries Data in Keras (10.5)

In this video we see how the encoder portion of a transformer can be used to predict timeseries data.
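The model demonstrated is the encoder-only pattern: a stack of self-attention encoder blocks over a fixed-length window of the series, pooled and fed to a dense regression head. Below is a minimal sketch of that pattern in Keras, not the video's exact code; the window length and hyperparameter values are illustrative assumptions (head_size=256 is referenced in a comment below):

```python
from tensorflow import keras
from tensorflow.keras import layers

def transformer_encoder(inputs, head_size, num_heads, ff_dim, dropout=0.0):
    # Self-attention sublayer with a residual connection (pre-norm style)
    x = layers.LayerNormalization(epsilon=1e-6)(inputs)
    x = layers.MultiHeadAttention(key_dim=head_size, num_heads=num_heads,
                                  dropout=dropout)(x, x)
    x = layers.Dropout(dropout)(x)
    res = x + inputs

    # Position-wise feed-forward sublayer (1x1 convolutions)
    x = layers.LayerNormalization(epsilon=1e-6)(res)
    x = layers.Conv1D(filters=ff_dim, kernel_size=1, activation="relu")(x)
    x = layers.Dropout(dropout)(x)
    x = layers.Conv1D(filters=inputs.shape[-1], kernel_size=1)(x)
    return x + res

def build_model(seq_len, n_features=1):
    inputs = keras.Input(shape=(seq_len, n_features))
    x = inputs
    for _ in range(4):                      # stack of encoder blocks
        x = transformer_encoder(x, head_size=256, num_heads=4, ff_dim=4)
    x = layers.GlobalAveragePooling1D()(x)  # collapse the time axis
    outputs = layers.Dense(1)(x)            # one-step-ahead regression
    return keras.Model(inputs, outputs)

model = build_model(seq_len=10)
model.compile(optimizer="adam", loss="mse")
```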

Code for This Video:

Follow Me/Subscribe:

Comments

You literally saved my life (the only tutorial on transformers for time series).

azizshaba

Jeff, even if you win the lottery or figure out Bitcoin, we need you to keep on teaching us, please.

Aguiraz

Fantastic video, congrats. I understand you don't need positional encoding here, but I think the example would be more complete with it; it's a very important part of more realistic and deeper examples.

pverd

This channel was a great discovery! Thanks a lot for all you share, Jeff

LeanTaoTe

Great video. I understood it in only 8 minutes. Other videos are like an hour long

naasvanrooyen

Hello Jeff. Must say I sure am glad you are creating this great content. Hope you get to chill on a beach somewhere as well.

niaguilar

Absolutely loved your video: short, concise, to the point. I am viewing your video because I am preparing for a proposal defense, and one of the questions I am trying to answer is whether or not an RNN-LSTM approach for time series prediction is better than Transformers. I would appreciate it if you could point me in the right direction of where I could find such information. Thank you!

mikedramatologist

This is awesome Jeff!! As always, thank you so much.

rt_shll

Thank you so much! Your material is amazing!

jairsales

Thanks for the amazing content, Jeff. Can you please let us know how we can incorporate position embedding into this architecture?

saeedrahman

Yes this was useful to me. Thank you for sharing.

edgetrading

Hi Jeff. Thank you so much for your amazing videos! In your prior transformer video you mentioned the importance of positional encoding, but I notice it isn't built into this time series model, where I'd imagine relative position is important for accurate prediction. Is it already baked into the Keras MultiHeadAttention component?

thomastran
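On the positional-encoding questions above: keras.layers.MultiHeadAttention applies no positional encoding of its own, so an encoder like the one sketched earlier treats the window as an unordered set unless positions are injected. Below is a minimal sketch of the standard sinusoidal encoding from "Attention Is All You Need"; the Dense projection in the usage comment is a hypothetical step to widen a univariate input so the encoding has dimensions to work with:

```python
import numpy as np
import tensorflow as tf

def positional_encoding(seq_len, d_model):
    # Sinusoidal encoding: sin on even dimensions, cos on odd dimensions
    pos = np.arange(seq_len)[:, None].astype(np.float64)   # (seq_len, 1)
    i = np.arange(d_model)[None, :].astype(np.float64)     # (1, d_model)
    angles = pos / np.power(10000.0, (2.0 * (i // 2)) / d_model)
    angles[:, 0::2] = np.sin(angles[:, 0::2])
    angles[:, 1::2] = np.cos(angles[:, 1::2])
    return tf.constant(angles[None, ...], dtype=tf.float32)  # (1, seq_len, d_model)

# Hypothetical usage: widen the univariate input, then add positions
# x = layers.Dense(64)(inputs)              # (batch, seq_len, 64)
# x = x + positional_encoding(seq_len, 64)  # broadcasts over the batch
```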

Hi Prof, why is a decoder not required for time series prediction? Thanks so much.

dbgm

Thanks for your explanation. It seems that for time series prediction you only need the transformer encoder, not the decoder part, is that right? And how would you predict multiple steps?

amyzimmermann
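On multi-step forecasting with the encoder-only setup: two common options, neither taken from the video itself, are (1) widening the output head to predict the whole horizon at once, or (2) recursively feeding one-step predictions back into the input window. A hypothetical sketch of the recursive option, assuming the one-step model from earlier:

```python
import numpy as np

# Option 1 (direct): replace Dense(1) with Dense(horizon) and train
# against targets of shape (batch, horizon).

# Option 2 (recursive): reuse the one-step model, sliding its own
# predictions back into the window.
def recursive_forecast(model, window, steps):
    """window: array of shape (seq_len, 1); returns `steps` predictions."""
    window = window.copy()
    preds = []
    for _ in range(steps):
        yhat = model.predict(window[None, ...], verbose=0)[0, 0]
        preds.append(yhat)
        # drop the oldest point, append the new prediction
        window = np.concatenate([window[1:], [[yhat]]], axis=0)
    return np.array(preds)
```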

Hi, can you help me? What about building the decoder part? I want to do forecasting using transformers in Keras, but I could not find any documentation. I would be thankful if you could help me.

somayehseifi

Thank you for the nice video. I have a question. When using the function to_sequences(), you discarded the first x observations, where x = sequence length, right? So if we choose sequence length = 100, we discard the first 100 data points for both the train and test sets. Is there any way to keep those data points? Thank you.

lzdddd
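Regarding to_sequences(): the helper presumably looks something like the sketch below (the body is an assumption, not the video's exact code). The first seq_size points are only ever inputs, never targets, which is the sense in which they are "discarded"; for the test set you can keep them usable by prepending the tail of the training series before windowing, as in the usage comment:

```python
import numpy as np

def to_sequences(seq_size, obs):
    """Slide a length-seq_size window over obs; each window's target is
    the next observation, so the first seq_size points never appear as
    targets."""
    x, y = [], []
    for i in range(len(obs) - seq_size):
        x.append(obs[i : i + seq_size])
        y.append(obs[i + seq_size])
    return np.array(x)[..., None], np.array(y)

# Keep the early test points usable by borrowing training context:
# test_in = np.concatenate([train[-seq_size:], test])
# x_test, y_test = to_sequences(seq_size, test_in)
```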

Dear Jeff, something confuses me. If we have a univariate sunspot feature here, why is the transformer's head size 256? I mean, shouldn't the head size be the number of features here? Please explain.

abdi
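On the head-size question: in keras.layers.MultiHeadAttention, key_dim sets the size of each head's internal query/key projection, not the number of input features, so key_dim=256 is valid even for a univariate series; the layer projects back to the input's feature dimension on output. A quick shape check:

```python
import tensorflow as tf
from tensorflow.keras import layers

x = tf.random.normal((32, 10, 1))   # 32 windows, length 10, 1 feature
attn = layers.MultiHeadAttention(num_heads=4, key_dim=256)
y = attn(x, x)
print(y.shape)  # (32, 10, 1): key_dim only sizes the internal per-head
                # projections; the output dimension matches the input
```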

Is it normal that the MultiHeadAttention layer in Keras is really, I mean really, slow? I checked the sample transformer model for time series prediction in the Keras documentation, and just that layer makes the model take about 7 minutes per epoch instead of the 2 seconds I get if I remove the MultiHeadAttention. Is it because of a poor implementation, or is the multi-head algorithm THAT complex no matter what you do? I'm using a GPU for the training (RTX

davidcristobal

Thank you, Jeff! Question: can this be used for text (non-numeric) sequences? For example, an observed pizza sequence of events 🙂 {Dough Sauce Toppings Cheese Bake Cut Box Deliver}. If we prompt with Dough Sauce Toppings Cheese <mask>..., we should get Bake, not Cut Box Deliver. Thank you!

TheUltimateBaccaratApp
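On categorical (non-numeric) sequences: in principle yes, by mapping the symbols to integer ids and an embedding before the encoder stack, and swapping the regression head for a softmax over the vocabulary. A hypothetical sketch using the pizza example; the layer sizes are illustrative:

```python
import tensorflow as tf
from tensorflow.keras import layers

steps = ["Dough", "Sauce", "Toppings", "Cheese", "Bake", "Cut", "Box", "Deliver"]
lookup = layers.StringLookup(vocabulary=steps)  # strings -> integer ids (0 = OOV)
ids = lookup(tf.constant([["Dough", "Sauce", "Toppings", "Cheese"]]))
x = layers.Embedding(input_dim=len(steps) + 1, output_dim=32)(ids)  # (1, 4, 32)
# x can now feed the same encoder stack; the output head becomes
# layers.Dense(len(steps) + 1, activation="softmax") to predict the next step.
```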

Even if you won the lottery, you would still be teaching this course. The beach gets boring after about a month.

joshuakessler