Lesson 14: Deep Learning Foundations to Stable Diffusion


We look at how to map the code from the previous lesson to the math behind backpropagation. Next, we refactor our code using PyTorch's `nn.Module`, which automatically tracks layers and parameters. We also create a sequential model using `nn.Sequential` and demonstrate how to create custom PyTorch modules. We then introduce the concept of an optimizer, which simplifies the process of updating parameters based on gradients and learning rates. We create a custom SGD optimizer from scratch and explore PyTorch's built-in DataLoader. We also create a proper training loop using PyTorch DataLoader.
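The refactoring described above can be sketched roughly as follows. This is a minimal illustration, not the lesson's actual notebook code: the layer sizes, class names, and synthetic data are assumptions made for the example; only the overall pattern (an `nn.Module` subclass, a from-scratch SGD optimizer, and a training loop over a `DataLoader`) follows the lesson's outline.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# A small MLP defined as an nn.Module subclass; nn.Module tracks the
# parameters of the layers registered here automatically.
class MLP(nn.Module):
    def __init__(self, n_in, n_h, n_out):
        super().__init__()
        self.layers = nn.Sequential(
            nn.Linear(n_in, n_h), nn.ReLU(), nn.Linear(n_h, n_out))

    def forward(self, x):
        return self.layers(x)

# An SGD optimizer written from scratch: step() updates each parameter
# from its gradient and the learning rate; zero_grad() resets gradients.
class SGD:
    def __init__(self, params, lr=0.1):
        self.params, self.lr = list(params), lr

    def step(self):
        with torch.no_grad():
            for p in self.params:
                p -= p.grad * self.lr

    def zero_grad(self):
        for p in self.params:
            p.grad = None

# A basic training loop driven by a PyTorch DataLoader.
def fit(model, opt, loss_fn, dl, epochs=1):
    for _ in range(epochs):
        for xb, yb in dl:
            loss = loss_fn(model(xb), yb)
            loss.backward()
            opt.step()
            opt.zero_grad()
    return loss.item()

# Tiny synthetic classification data, just to exercise the loop.
x = torch.randn(64, 4)
y = (x.sum(1) > 0).long()
dl = DataLoader(TensorDataset(x, y), batch_size=16, shuffle=True)
model = MLP(4, 8, 2)
opt = SGD(model.parameters(), lr=0.1)
final_loss = fit(model, opt, nn.CrossEntropyLoss(), dl, epochs=5)
```

Swapping the hand-rolled `SGD` for `torch.optim.SGD(model.parameters(), lr=0.1)` leaves the loop unchanged, which is the point of the optimizer abstraction.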

Throughout the lesson, we emphasize the importance of understanding the underlying code and not relying solely on other people's code. This allows for greater flexibility and creativity in building custom solutions. We also discuss the use of `**kwargs` and delegates in fastcore, callbacks, and dunder methods in Python's data model.
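The Python concepts mentioned above can be sketched in a few lines. These are illustrative examples written for this summary (the names and bodies are assumptions, not the lesson's notebook code): a callback implemented as a callable class, `*args`/`**kwargs` forwarding, and a class whose behavior under `+` is defined with dunder methods.

```python
# A callback as a callable class: any object with __call__ can be
# passed where a function is expected.
class ProgressCB:
    def __call__(self, epoch, **kwargs):
        return f"epoch {epoch}: {kwargs}"

# A loop that accepts an optional callback and forwards extra
# positional/keyword arguments through to it.
def run_calculation(n, cb=None, *args, **kwargs):
    res = 0
    for i in range(n):
        res += i
        if cb:
            cb(i, *args, **kwargs)
    return res

# Dunder methods hook into Python's data model: defining __add__
# makes + work on instances of this class.
class SloppyAdder:
    def __init__(self, o): self.o = o
    def __add__(self, b): return SloppyAdder(self.o + b.o + 0.01)
    def __repr__(self): return str(self.o)
```

Calling `run_calculation(5, ProgressCB(), loss=0.5)` invokes the callback each step with the forwarded keyword argument, and `SloppyAdder(1) + SloppyAdder(2)` produces an object holding roughly `3.01`.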

0:00:00 - Introduction
0:00:30 - Review of code and math from Lesson 13
0:07:40 - f-Strings
0:10:00 - Re-running the Notebook - Run All Above
0:12:48 - Generator Object
0:13:26 - Class MLP: Inheriting from nn.Module
0:17:03 - Checking the more flexible refactored MLP
0:17:53 - Creating our own nn.Module
0:21:38 - Using PyTorch’s nn.Module
0:23:51 - Using PyTorch’s nn.ModuleList
0:24:59 - reduce()
0:26:49 - PyTorch’s nn.Sequential
0:27:35 - Optimizer
0:29:37 - PyTorch’s optim and get_model()
0:30:04 - Dataset
0:33:29 - DataLoader
0:35:53 - Random sampling, batch size, collation
0:40:59 - What does collate do?
0:45:17 - fastcore’s store_attr()
0:46:07 - Multiprocessing DataLoader
0:50:36 - PyTorch’s Multiprocessing DataLoader
0:53:55 - Validation set
0:56:11 - Hugging Face Datasets, Fashion-MNIST
1:01:55 - collate function
1:04:41 - transforms function
1:06:47 - decorators
1:09:42 - itemgetter
1:11:55 - PyTorch’s default_collate
1:15:38 - Creating a Python library with nbdev
1:18:53 - Plotting images
1:21:14 - **kwargs and fastcore’s delegates
1:28:03 - Computer Science concepts with Python: callbacks
1:33:40 - Lambdas and partials
1:36:26 - Callbacks as callable classes
1:37:58 - Multiple callback funcs; *args and **kwargs
1:43:15 - __dunder__ thingies
1:47:33 - Wrap-up

Comments

This goes beyond ML. This is fantastic for just knowing how to program and overall CS in general. Thank you for this work

haldanesghost

Great, very useful tips, thank you very much!

michaelmuller

Q: why not just make bs=len(valid_ds), i.e. make the batch size for the validation set equal to its length? I can't see the point of batching the validation set, since we're just computing some metric on it.

mchristos

Great explanation. BTW where is the link for part 1 series?

diodin