PyTorch Lightning Tutorial - Lightweight PyTorch Wrapper For ML Researchers

PyTorch Lightning is a lightweight PyTorch wrapper that helps you scale your models and write less boilerplate code. In this tutorial we learn about the framework and how to convert our PyTorch code to Lightning code.

Lightning is nothing more than organized PyTorch code, so once you've organized it into a LightningModule, it automates most of the training for you. The beauty of Lightning is that it handles the details of when to validate, when to call .eval(), turning off gradients, detaching graphs, making sure you don't enable shuffle for the validation set, etc.
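
To make that concrete, here is a minimal sketch of what organizing PyTorch code into a LightningModule can look like. The class name, network, and hyperparameters are made up for illustration; the LightningModule/Trainer API shown is standard PyTorch Lightning.

import torch
from torch import nn
import pytorch_lightning as pl

class LitClassifier(pl.LightningModule):
    def __init__(self, input_size=28 * 28, hidden_size=64, num_classes=10, lr=1e-3):
        super().__init__()
        self.lr = lr
        self.net = nn.Sequential(
            nn.Linear(input_size, hidden_size),
            nn.ReLU(),
            nn.Linear(hidden_size, num_classes),
        )

    def forward(self, x):
        return self.net(x.view(x.size(0), -1))

    def training_step(self, batch, batch_idx):
        images, labels = batch
        loss = nn.functional.cross_entropy(self(images), labels)
        self.log("train_loss", loss)
        return loss

    def validation_step(self, batch, batch_idx):
        images, labels = batch
        # Lightning switches to eval mode and disables gradients for us here.
        self.log("val_loss", nn.functional.cross_entropy(self(images), labels))

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=self.lr)

# The Trainer then replaces the manual training loop:
# trainer = pl.Trainer(max_epochs=3)
# trainer.fit(LitClassifier(), train_loader, val_loader)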

Code:

PyTorch Lightning:

My PyTorch Course:

Tensorboard Tutorial:

You can find me here:

#Python #PyTorch

Comments

Let me know if you like this wrapper framework or not :)

patloeber

Been using Lightning for a couple of months now. IT'S AWESOME

jack

Very well explained. As someone who only recently got used to PyTorch, it's nice to see that Lightning lets me skip a lot of the manual looping and optimizer/loss management. Thanks also for including the TensorBoard snippet, which is very helpful compared to generating your own graphs in Matplotlib.

raymond-andrade

Finally done with your PyTorch playlist!! Your videos have gradually removed my fears about implementing neural networks and will now serve as the foundations upon which I will grow my knowledge base in AI. I recommended this series to my brother, who is an ML research scientist, and he's also learning a lot from it. Thank you very much for this, and I hope your channel grows by leaps and bounds as the field grows.

BTW there's an update w.r.t. returning the dictionary from the validation_epoch_end() function:
UserWarning: The validation_epoch_end should not return anything as of 0.9.1. To log, use self.log(...) or self.write(...) directly in the LightningModule.
UserWarning: The 'log' dictionary keyword was deprecated in 0.9.1 and will be removed in 1.0.0. Please use self.log(...) inside the LightningModule instead.

TechnGizmos

Yesss, and thanks! I have been waiting for this since you mentioned it in the last video.

thantyarzarhein

Many thanks for your work on this series. It is much appreciated. One point to make about this tutorial is that the code for interfacing PyTorch Lightning with TensorBoard has changed.
Lightning now uses Loggers, and fortunately TensorBoard is its default logger.
To get the code to work with the latest Lightning, replace (or comment out) these two lines in the training_step function:
# tensorboard_logs = {'train_loss': loss.detach()}
# return {'loss': loss, 'log': tensorboard_logs}
with the simpler
self.log("train_loss", loss)
return loss
Make similar changes in the validation_epoch_end function, using:
self.log("val_loss", avg_loss)
return avg_loss
to replace the original two lines, which used the dictionary format shown above.
Thanks again for your wonderful work. I hope to soon see a PyTorch tutorial on autoencoders!
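
To put the fix described above in one place, here is a sketch of the two affected methods using the newer self.log style. The class name is illustrative and F.cross_entropy stands in for whatever loss the tutorial uses; with on_epoch=True, Lightning averages the metric over the epoch, so the manual aggregation in validation_epoch_end is usually no longer needed.

import pytorch_lightning as pl
import torch.nn.functional as F

class LitModel(pl.LightningModule):
    # __init__, forward, configure_optimizers as in the tutorial ...

    def training_step(self, batch, batch_idx):
        images, labels = batch
        loss = F.cross_entropy(self(images), labels)
        self.log("train_loss", loss)   # replaces {'loss': loss, 'log': tensorboard_logs}
        return loss

    def validation_step(self, batch, batch_idx):
        images, labels = batch
        loss = F.cross_entropy(self(images), labels)
        # Logged as an epoch-level average, so nothing needs to be returned
        # from validation_epoch_end anymore.
        self.log("val_loss", loss, on_step=False, on_epoch=True)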

timmervyn

As always, awesome tutorials. Many thanks for your efforts.

arshadAlieusafzay

Awesome tutorials. Many thanks for your efforts.

mizumo

Hi, thanks for the tutorial. I would like to know how to write the prediction logic for the trained model. Could you please share some information on it?
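
For anyone with the same question, a sketch of two common ways to run predictions with a trained LightningModule. The names model, new_images, and predict_loader are placeholders, and Trainer.predict assumes a Lightning version that provides it.

import torch

# 1) Plain PyTorch inference: a LightningModule is still an nn.Module.
model.eval()
with torch.no_grad():
    logits = model(new_images)
    preds = logits.argmax(dim=1)

# 2) Let the Trainer drive it by adding a predict_step to the LightningModule:
#
#    def predict_step(self, batch, batch_idx):
#        images, _ = batch
#        return self(images).argmax(dim=1)
#
# predictions = trainer.predict(model, dataloaders=predict_loader)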

chakkarapaniv

Thanks a lot! I have a question: when you used 'model' instead of 'self' (the mistake you mentioned in the video), what does the code actually do? What does it understand 'model' to be in that method?
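
A hypothetical illustration of what happens in that case: inside the method, the name model is not an argument, so Python resolves it as the global variable created further down. It happens to point at the very same object as self, which is why the code still runs, but it now silently depends on that global name.

import torch
import torch.nn.functional as F
from torch import nn
import pytorch_lightning as pl

class LitModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(28 * 28, 10)

    def forward(self, x):
        return self.layer(x.view(x.size(0), -1))

    def training_step(self, batch, batch_idx):
        images, labels = batch
        outputs = model(images)   # accidental lookup of the global 'model' defined below
        # outputs = self(images)  # the correct, self-contained version
        return F.cross_entropy(outputs, labels)

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters())

model = LitModel()  # the global object that the method above happens to find

It works on a single device because both names refer to the same module, but relying on the global can break once Lightning wraps, copies, or moves the module (for example with multi-GPU strategies), so self(images) is the safe form.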

raminessalat

Thanks for making our lives so easy

monishkarunakaran

Thank you for the perfect language, which is not commonly found among Python YouTube channels. If you know what I mean.

TotoGoTravel

PyTorch Lightning is wonderful! But I couldn't figure out how to use the LightningDataModule for a custom (extremely large) dataset.
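
One possible shape for that: a sketch of a LightningDataModule wrapping a lazily-loading Dataset, so the full dataset never has to sit in memory. The dataset class, path, and split ratio are placeholders; the DataModule hooks (prepare_data, setup, train_dataloader, val_dataloader) are standard Lightning.

import pytorch_lightning as pl
from torch.utils.data import Dataset, DataLoader, random_split

class LargeCustomDataset(Dataset):
    # Placeholder: index the samples here and load each one lazily in
    # __getitem__ (e.g. one file per sample), so memory use stays small.
    def __init__(self, data_dir):
        self.data_dir = data_dir

    def __len__(self):
        return 0  # number of indexed samples

    def __getitem__(self, idx):
        raise NotImplementedError  # load and return a single (x, y) pair

class CustomDataModule(pl.LightningDataModule):
    def __init__(self, data_dir, batch_size=32, num_workers=4):
        super().__init__()
        self.data_dir = data_dir
        self.batch_size = batch_size
        self.num_workers = num_workers

    def prepare_data(self):
        pass  # download / unpack once, on a single process

    def setup(self, stage=None):
        full = LargeCustomDataset(self.data_dir)
        n_val = int(0.1 * len(full))
        self.train_set, self.val_set = random_split(full, [len(full) - n_val, n_val])

    def train_dataloader(self):
        return DataLoader(self.train_set, batch_size=self.batch_size,
                          shuffle=True, num_workers=self.num_workers)

    def val_dataloader(self):
        return DataLoader(self.val_set, batch_size=self.batch_size,
                          shuffle=False, num_workers=self.num_workers)

# trainer.fit(model, datamodule=CustomDataModule("data/"))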

talhayousuf

Is this basically Keras but for PyTorch? I usually build deep learning models using TensorFlow and recently picked up PyTorch. As frustrating as it is, I managed to make a non-MNIST model, but it seems a lot more complicated than TensorFlow.

graceyudha

Love the idea behind it! Abstracting the engineering away from the science.

I wonder what happens when something doesn't work, though - how painful is it to debug? How responsive are the contributors when there is a bug?
Also, I'd love to know the performance overhead, etc. (I guess it can't be too big, and the framework is probably geared towards rapid prototyping rather than production either way.)

All of those would need a satisfying answer before I'd want to switch to PyTorch Lightning (PyTorch user here).

TheAIEpiphany

Excellent tutorial! Thank you so much! I have a question about lines 41 & 76 in the GitHub code file: outputs = self(images). Should it be outputs = self.forward(images)?
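
For context (this is general nn.Module behaviour, not something specific to that file): calling the module as self(images) goes through nn.Module.__call__, which dispatches to forward() and also runs any registered hooks, so it is the usual spelling; self.forward(images) gives the same result but bypasses the hooks. A tiny standalone check:

import torch
from torch import nn

layer = nn.Linear(4, 2)
x = torch.randn(3, 4)

out_call = layer(x)             # goes through __call__ (hooks included)
out_forward = layer.forward(x)  # calls forward directly, skipping hooks

# Identical here because no hooks are registered.
assert torch.equal(out_call, out_forward)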

yuanjizhang

Does PyTorch Lightning work with Mac M2 chips?
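
For reference, a quick sketch of how one would typically check and enable that, assuming a PyTorch build with MPS support and a Lightning version that exposes the "mps" accelerator:

import torch
import pytorch_lightning as pl

# True if this PyTorch build can use the Apple-Silicon (Metal) backend.
print(torch.backends.mps.is_available())

# Ask the Trainer for the MPS accelerator explicitly...
trainer = pl.Trainer(accelerator="mps", devices=1, max_epochs=3)

# ...or let Lightning pick the best available device automatically.
trainer = pl.Trainer(accelerator="auto", devices="auto", max_epochs=3)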

wryltxw

Can you also make some for transformers?

mohd.faizan

Hey there, is it necessary to be good at PyTorch before I can learn PyTorch Lightning? I am a beginner in PyTorch.

janmejaybhoi

Due to an update of Lightning, the new import command is "import lightning as pl".
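
For anyone following the video, both package names refer to the same framework; a quick sketch of the two spellings (the older one is what this tutorial uses):

# Newer Lightning releases ship a unified package:
import lightning as pl            # pl.LightningModule, pl.Trainer, ...

# The import used in this tutorial still works with the classic package:
# import pytorch_lightning as pl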

anonim