12-a LFD: Noise and regularization in a nutshell: constrain the model.

Machine Learning From Data, Rensselaer Fall 2020.

Professor Malik Magdon-Ismail talks about regularization, a tool to combat overfitting. We link constrained models to better generalization. Then, within the context of the concrete linear regression model, we derive the popular weight-decay regularizer from a budget-constrained hypothesis set. The pseudoinverse algorithm for regression becomes the regularized pseudoinverse algorithm for regularized regression. We show experiments comparing the dramatic impact of even a little regularization versus no regularization, especially when there is noise (stochastic or deterministic).
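
For concreteness, the regularized pseudoinverse solution for weight decay is w_reg = (Z^T Z + lambda*I)^{-1} Z^T y, which reduces to the ordinary pseudoinverse fit when lambda = 0. Below is a minimal sketch in Python (NumPy); the feature transform, data, and lambda value are illustrative assumptions, not taken from the lecture.

```python
import numpy as np

def regularized_pseudoinverse(Z, y, lam):
    """Weight-decay (ridge) fit: w_reg = (Z^T Z + lam*I)^{-1} Z^T y."""
    d = Z.shape[1]
    return np.linalg.solve(Z.T @ Z + lam * np.eye(d), Z.T @ y)

# Hypothetical noisy data and a simple polynomial feature transform.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=20)
Z = np.column_stack([np.ones_like(x), x, x**2])          # features [1, x, x^2]
y = np.sin(np.pi * x) + 0.3 * rng.standard_normal(20)    # target with stochastic noise

w_unreg = regularized_pseudoinverse(Z, y, lam=0.0)   # plain pseudoinverse fit
w_reg   = regularized_pseudoinverse(Z, y, lam=0.1)   # a little weight decay
```

With lam=0 this recovers the unregularized pseudoinverse weights; a small positive lam shrinks the weights, which is the effect the lecture's experiments compare on noisy data.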

This is the twelfth lecture in a "theory" course focusing on the foundations of learning, as well as some of the more advanced techniques like support vector machines and neural (deep) networks that are used in practice.

Level of the course: Advanced undergraduate, beginning graduate. Knowledge of probability, linear algebra, and calculus is helpful.
