Lesson 19: Deep Learning Foundations to Stable Diffusion


0:00:00 - Introduction and quick update from last lesson
0:02:08 - Dropout
0:12:07 - DDPM from scratch - Paper and math
0:40:17 - DDPM - The code
0:41:16 - U-Net Neural Network
0:43:41 - Training process
0:56:07 - Inheriting from miniai TrainCB
1:00:22 - Using the trained model: denoising with “sample” method
1:09:09 - Inference: generating some images
1:14:56 - Notebook 17: Jeremy’s exploration of Tanishq’s notebook
1:24:09 - Make it faster: Initialization
1:27:41 - Make it faster: Mixed Precision
1:29:40 - Change of plans: Mixed Precision goes to Lesson 20

Many thanks to Francisco Mussari for timestamps and transcription.
Comments

Thank you Jeremy, Tanishq, and Jono. I appreciate you guys for this lesson.

giorda

Greek letters for identifiers are a pointless distraction, imo. The alternative isn't spelled-out Greek letter names, but meaningful names. "For coders" ...

rjScubaSki

Why is the coding implementation for sampling x_{t-1} @1:06:25 different from Algorithm 2 for sampling given in the paper @53:50?

timandersen
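For reference, the question above concerns Algorithm 2 of the DDPM paper. A minimal sketch of that sampling step is below; it assumes the paper's linear beta schedule, and `eps_pred` stands in for the trained U-Net's noise prediction (a hypothetical argument here, not the course's actual model call):

```python
import numpy as np

T = 1000
beta = np.linspace(1e-4, 0.02, T)       # linear schedule from the DDPM paper
alpha = 1.0 - beta
alpha_bar = np.cumprod(alpha)

def sample_step(x_t, eps_pred, t, rng):
    """One reverse step of Algorithm 2 (sketch): x_{t-1} from x_t."""
    coef = (1.0 - alpha[t]) / np.sqrt(1.0 - alpha_bar[t])
    mean = (x_t - coef * eps_pred) / np.sqrt(alpha[t])
    sigma = np.sqrt(beta[t])            # the paper's simpler sigma choice
    z = rng.standard_normal(x_t.shape) if t > 0 else 0.0  # no noise at t = 0
    return mean + sigma * z
```

Any discrepancy between a notebook and this form usually comes down to algebraic rearrangement of the same mean, or to which of the two sigma choices from Section 3.2 is used.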

Does the course dive deeper into the architecture of the U-Net model? I feel there are a lot of intricacies we're missing out on there.

deepschoolai

46:03 I might be wrong here, but I don't think sigma is the square root of beta. It is the square root of beta tilde, which is NOT the value with linearly spaced entries. Getting this from Section 3.2 of the DDPM paper.

deepschoolai
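On the comment above: Section 3.2 of the DDPM paper actually allows both choices, sigma_t^2 = beta_t or sigma_t^2 = beta-tilde_t, where beta-tilde_t = (1 - alpha-bar_{t-1}) / (1 - alpha-bar_t) * beta_t. A short sketch of the distinction (assuming the paper's linear beta schedule):

```python
import numpy as np

T = 1000
beta = np.linspace(1e-4, 0.02, T)             # beta IS the linearly spaced quantity
alpha_bar = np.cumprod(1.0 - beta)
alpha_bar_prev = np.concatenate([[1.0], alpha_bar[:-1]])

# Posterior variance from Section 3.2; NOT linearly spaced.
beta_tilde = (1.0 - alpha_bar_prev) / (1.0 - alpha_bar) * beta

sigma_simple = np.sqrt(beta)                  # one valid choice of sigma_t
sigma_tilde = np.sqrt(beta_tilde)             # the other valid choice
```

Since alpha-bar_0 for the "previous" step is 1, beta-tilde starts at exactly 0 and stays at or below beta everywhere, so the two sigmas differ most at small t.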

I cannot find the DDPM notebook in the diffusion-nbs repo. Can somebody paste a link to it?

SandeepSinghPlus