Transformer-Based Time Series with PyTorch (10.3)

This video explores the power of Transformer models in time series analysis using PyTorch. Originally designed for natural language processing, Transformer architectures have shown incredible potential when applied to time series data. The video provides an in-depth walkthrough of how to leverage these architectures for forecasting, anomaly detection, and more. From the underlying theory to hands-on coding sessions, we'll guide you through the intricacies of implementing Transformer-based models in PyTorch, ensuring you're well equipped to apply them to your own time series data.
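As a rough illustration of the kind of model the video builds, here is a minimal encoder-only Transformer for one-step-ahead forecasting. All names and hyperparameters below are illustrative assumptions, not the video's actual code, and positional encoding is omitted for brevity:

```python
import torch
import torch.nn as nn

class TimeSeriesTransformer(nn.Module):
    """Sketch: Transformer encoder over a window of values, linear head as 'decoder'."""
    def __init__(self, input_dim=1, d_model=64, nhead=4, num_layers=2, dropout=0.1):
        super().__init__()
        self.input_proj = nn.Linear(input_dim, d_model)  # embed each time step
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead, dropout=dropout, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers)
        self.head = nn.Linear(d_model, 1)  # linear "decoder" producing the forecast

    def forward(self, x):  # x: (batch, seq_len, input_dim)
        h = self.encoder(self.input_proj(x))
        return self.head(h[:, -1, :])  # forecast from the final time step

model = TimeSeriesTransformer()
x = torch.randn(8, 30, 1)  # batch of 8 windows, 30 time steps, 1 feature
y_hat = model(x)           # shape: (8, 1)
```

Multivariate inputs fit the same skeleton: widen `input_dim` to the number of features and the `nn.Linear` projection handles the rest.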

Code for This Video:

~~~~~~~~~~~~~~~ COURSE MATERIAL ~~~~~~~~~~~~~~~
📖 Textbook - Coming soon

#Transformers #TimeSeries #PyTorch #DeepLearning #Forecasting #NLP #TimeSeriesAnalysis #TransformerModels #PyTorchTutorial #MachineLearning
Comments

The video is great and well explained. Could you also tell me how I can implement this for a use case where I have multiple features in my dataframe and one regression output variable y (6 inputs, one output)?

yuvrajpatra

Why is the decoder layer an nn.Linear? Would it be better to use nn.TransformerDecoderLayer, and how would I use it?

chunziWang-rbkd

Your explanation is very simple. Wow, great job, man. All respect!

shanks

Amazing, just what I needed, thanks! ❤

zoe.tsekas

Thanks for the explanation. What if we use an LSTM along with the Transformer (attention mechanism)? Would it help, or just make the model more complex?

hoseinhabibi

Hi, one question. When the model triggers early stopping, the validation loss hasn't decreased at all (this is also shown in the video). Is this model really learning anything, or is it just for demonstration? Will any hyperparameter tuning make a difference?

georgevlachodimitropoulos

Can Transformers work with irregular time series? It would be great to get some info about irregular time series; Google points me to CNNs, but I still need to test that.

Dmitrii-qp

Do you plan to add up- and downcycling like in the MetNet-3 model as well?

matthiaswiedemann

At least in PyTorch 2.2 I got a warning from the line `self.transformer_encoder = nn.TransformerEncoder(encoder_layers, num_layers)` in `TransformerModel`. Setting `enable_nested_tensor=True` in the TransformerEncoder fixed that.

SaschaRobitzki

Still a little confused: why use just a linear layer as the decoder?

EvelynGolden-ys

Is this model considered a hybrid model?

bothainah.r

Nice, but you didn't run the cells for us to see.

vigoworkchannel