9.1: L1 and L2 Regularization with Keras and TensorFlow (Module 9, Part 1)

Using L1 (lasso) and L2 (ridge) regression with scikit-learn. This introduction to linear regression regularization lays the foundation for understanding L1/L2 regularization in Keras. This video is part of a course taught in a hybrid format at Washington University in St. Louis; however, all of the material is online, so you can easily follow along. T81-558: Applications of Deep Learning, Washington University in St. Louis.
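For reference, here is a minimal sketch of the two penalties in scikit-learn; the synthetic dataset, alpha values, and variable names are illustrative and not the exact notebook used in the video.

import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso, Ridge

# Synthetic data: only a few of the 20 inputs are actually informative.
X, y = make_regression(n_samples=200, n_features=20, n_informative=5,
                       noise=10.0, random_state=42)

# L1 (lasso) tends to drive the coefficients of useless inputs to exactly zero.
lasso = Lasso(alpha=1.0).fit(X, y)
print("Lasso coefficients at exactly zero:", int(np.sum(lasso.coef_ == 0)))

# L2 (ridge) shrinks all coefficients toward zero but rarely zeroes them out.
ridge = Ridge(alpha=1.0).fit(X, y)
print("Smallest ridge coefficient magnitude:", float(np.abs(ridge.coef_).min()))

A hedged sketch of the same idea in Keras, where the penalty is attached to a layer's weights through kernel_regularizer; the layer sizes and penalty strengths here are placeholders.

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras import regularizers

model = Sequential([
    # L1 penalty on this layer's weights (strength 1e-4 is a placeholder).
    Dense(25, activation="relu", input_shape=(20,),
          kernel_regularizer=regularizers.l1(1e-4)),
    # L2 penalty on this layer's weights.
    Dense(10, activation="relu",
          kernel_regularizer=regularizers.l2(1e-4)),
    Dense(1),
])
model.compile(optimizer="adam", loss="mse")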

Please subscribe and comment!

Comments

This video helped me a lot. Can't believe no one has commented. Thanks for an accurate and concise lesson!

alish

Really good walk-through. Appreciate these videos!

chocolmilkman

I really appreciate your work. So clear and easy to follow. Great lecture, thanks!

karthikreddysangala

This was a great introduction, subscribed!

Guytothemillionth

Great videos! Thanks! There is something I'd like to confirm: sklearn linear regression is a single-layer neural network, which is not the case for the networks we usually build in your classes, correct? Does that mean that to use your report_coef to remove useless inputs, we should work with a single layer? I'm a little confused by this part. Thanks!

samylebrave
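On the question above: a linear regression can be viewed as a single neuron with a linear activation, so each input maps to exactly one coefficient. Below is a hypothetical sketch of a coefficient report in that spirit; report_coef here is a stand-in written for illustration, and the course's actual helper may differ.

import numpy as np
import pandas as pd
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso

def report_coef(names, coef, intercept):
    # Tabulate one fitted coefficient per input, largest magnitudes first
    # (assumed behavior; not necessarily the course's implementation).
    table = pd.DataFrame({"coef": coef, "positive": coef >= 0}, index=names)
    print(f"Intercept: {intercept:.3f}")
    return table.sort_values(by="coef", key=np.abs, ascending=False)

X, y = make_regression(n_samples=200, n_features=8, n_informative=3,
                       noise=5.0, random_state=1)
names = [f"x{i}" for i in range(X.shape[1])]

# With an L1 (lasso) fit, inputs whose coefficients land at exactly zero are
# candidates for removal; this reasoning applies directly to a linear
# (single-layer) model, where each input has a single coefficient, not to
# the many weights of a deep network.
lasso = Lasso(alpha=1.0).fit(X, y)
print(report_coef(names, lasso.coef_, lasso.intercept_))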