Intro to AI Lab 5: Optimizers
In this LAB session of the "Introduction to AI: Neural Nets, LLMs, and Gen AI" course, we cover the following:
- Learning rate schedulers and optimizers
- Testing different optimizer/scheduler combinations on a neural network trained on the California housing dataset, a regression task that predicts housing prices
- The importance of batch size and data pre-processing, and how schedulers such as cosine annealing work
- How different schedulers, such as ExponentialLR and StepLR, modify the learning rate during training
- The purpose of using different optimizers (e.g., Adam, SGD) and schedulers, and how each combination affects the model's performance
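To make the scheduler behavior concrete, here is a minimal sketch of the update rules behind the three schedulers mentioned above, written in plain Python rather than the PyTorch `lr_scheduler` classes used in the lab (the function names are illustrative, not part of any library):

```python
import math

def step_lr(base_lr, epoch, step_size, gamma):
    # StepLR: multiply the learning rate by gamma every step_size epochs.
    return base_lr * gamma ** (epoch // step_size)

def exponential_lr(base_lr, epoch, gamma):
    # ExponentialLR: multiply the learning rate by gamma every epoch.
    return base_lr * gamma ** epoch

def cosine_annealing_lr(base_lr, epoch, t_max, eta_min=0.0):
    # Cosine annealing: smoothly decay from base_lr to eta_min over t_max epochs.
    return eta_min + (base_lr - eta_min) * (1 + math.cos(math.pi * epoch / t_max)) / 2
```

For example, with `base_lr=0.1`, `step_size=10`, and `gamma=0.1`, StepLR keeps the rate at 0.1 for epochs 0-9 and drops it to 0.01 at epoch 10, whereas ExponentialLR shrinks it a little every epoch and cosine annealing follows a smooth half-cosine curve down to `eta_min`.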
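The lab pairs an optimizer with a scheduler on a real regression task; the sketch below shows the same idea on a toy 1-D problem using plain SGD with a StepLR-style decay. This is an illustrative stand-in for the lab's code, not the actual notebook: the function, dataset, and hyperparameters are all assumptions made up for the example.

```python
import random

def train_linear(epochs=200, base_lr=0.1, step_size=50, gamma=0.5):
    # Toy regression: fit y = 2x + 1 by minimizing squared error with
    # per-sample SGD, decaying the learning rate StepLR-style.
    random.seed(0)
    data = [(x, 2 * x + 1) for x in [i / 10 for i in range(-10, 11)]]
    w, b = 0.0, 0.0
    for epoch in range(epochs):
        lr = base_lr * gamma ** (epoch // step_size)  # StepLR-style decay
        random.shuffle(data)  # stochastic: visit samples in random order
        for x, y in data:
            pred = w * x + b
            grad = 2 * (pred - y)   # d(MSE)/d(pred) for one sample
            w -= lr * grad * x      # chain rule through pred = w*x + b
            b -= lr * grad
    return w, b
```

Swapping the decay rule (or the optimizer's update step) changes how quickly and how stably the parameters settle, which is exactly the effect the lab explores by trying different optimizer/scheduler combinations.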