Regularization - Explained!

We will explain Ridge, Lasso and a Bayesian interpretation of both.
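
As a reference for the comments below, these are the standard textbook forms of the two penalized objectives (the video's exact notation may differ):

```latex
% Ridge: squared-error loss plus an L2 penalty on the weights
\hat{\beta}_{\mathrm{ridge}} = \arg\min_{\beta} \ \lVert y - X\beta \rVert_2^2 + \lambda \lVert \beta \rVert_2^2

% Lasso: the same loss with an L1 penalty, which drives some weights to exactly zero
\hat{\beta}_{\mathrm{lasso}} = \arg\min_{\beta} \ \lVert y - X\beta \rVert_2^2 + \lambda \lVert \beta \rVert_1
```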

ABOUT ME

RESOURCES

MATH COURSES (7 day free trial)

OTHER RELATED COURSES (7 day free trial)
Comments

Why is this so underrated? This should be on everyone's playlist for linear regression.
Hats off, man :)

ashishanand

Hi Ajay, great video, as always. One suggestion, with your permission ;) I think it might be worthwhile to introduce the concept of regularization by comparing:

feature elimination (which is equivalent to setting a weight to zero) vs. reducing the weight (which is what regularization does), elaborating on this before drifting towards Lasso and Ridge. ;)

ajaytaneja

I had to watch it twice to truly digest it, but I like your approach to the contour plot in particular. I hope to boost your channel with my comments a tiny bit ;). tyvm!
What I was taught, and what is helpful to know imo:
1) On an abstract level, what regularization achieves: it penalizes large weights, keeping high-degree terms in check.
2) On the notion of L1 and L2 regularization: where you say "Gaussian" for Ridge, you could likewise say "Laplace" distribution for Lasso regression, rather than "double exponential" (they are two names for the same distribution).

paull
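
Picking up on point 2 of the comment above, a minimal sketch of the Bayesian (MAP) connection, under the usual assumptions of Gaussian noise with variance \sigma^2 and an i.i.d. prior on the weights:

```latex
% MAP estimation: maximize log-likelihood plus log-prior
\hat{\beta}_{\mathrm{MAP}} = \arg\max_{\beta} \ \log p(y \mid X, \beta) + \log p(\beta)

% A Gaussian prior p(\beta_j) \propto e^{-\beta_j^2 / 2\tau^2} yields the Ridge penalty:
\hat{\beta} = \arg\min_{\beta} \ \lVert y - X\beta \rVert_2^2 + \tfrac{\sigma^2}{\tau^2} \lVert \beta \rVert_2^2

% A Laplace (double exponential) prior p(\beta_j) \propto e^{-\lvert\beta_j\rvert / b} yields the Lasso penalty:
\hat{\beta} = \arg\min_{\beta} \ \lVert y - X\beta \rVert_2^2 + \tfrac{2\sigma^2}{b} \lVert \beta \rVert_1
```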

Thank you very much for this answer; I had been looking for it for a while: 7:42

ivanalejandrogarciaramirez

Excellent videos! Great graphing for the intuition of how L1 regularization makes parameters exactly zero (9:45), compared with the behavior of L2 regularization.

blairnicolle
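
A minimal scikit-learn sketch (not from the video; the data and penalty strengths are made up) that reproduces the behavior described above: on the same data, Lasso zeros out the irrelevant coefficients while Ridge only shrinks them:

```python
import numpy as np
from sklearn.linear_model import Lasso, Ridge

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
# Only the first two features matter; the remaining eight are noise.
true_coef = np.array([3.0, -2.0] + [0.0] * 8)
y = X @ true_coef + rng.normal(scale=0.5, size=200)

ridge = Ridge(alpha=1.0).fit(X, y)
lasso = Lasso(alpha=0.1).fit(X, y)

# Ridge shrinks every coefficient a little; none become exactly zero.
# Lasso sets the noise coefficients exactly to zero.
print("ridge:", np.round(ridge.coef_, 3))
print("lasso:", np.round(lasso.coef_, 3))
print("exact zeros (ridge, lasso):", np.sum(ridge.coef_ == 0.0), np.sum(lasso.coef_ == 0.0))
```

The corners of the L1 ball in the contour plot are exactly why those zeros appear: the loss contours typically first touch the constraint region at a corner, where some coordinates are zero.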

Well hello everyone right back at you, Ajay! These are fire, the live viz is on point!

NicholasRenotte

Such an awesome video! Can't believe I hadn't made the connection between Ridge and Lagrangians; it literally has a lambda in it lol!

cormackjackson

Nice explanation of the Bayesian view.
Isn't regularization just the Lagrange multiplier? The optimum point is where the gradient of the constraint is proportional to the gradient of the cost function.

chadx
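
Spelling out the Lagrangian view from the two comments above (a standard derivation, not verbatim from the video): the penalized objective is the Lagrangian of a constrained problem, with \lambda as the multiplier.

```latex
% Constrained form: minimize the loss subject to a budget t on the weights
\min_{\beta} \ \lVert y - X\beta \rVert_2^2 \quad \text{s.t.} \quad \lVert \beta \rVert_2^2 \le t

% Lagrangian: the Ridge penalty parameter \lambda is the Lagrange multiplier
\mathcal{L}(\beta, \lambda) = \lVert y - X\beta \rVert_2^2 + \lambda \left( \lVert \beta \rVert_2^2 - t \right)

% KKT stationarity: at the optimum the two gradients are (anti-)parallel,
% which is the tangency of the loss contours and the constraint ball
\nabla_{\beta} \lVert y - X\beta \rVert_2^2 = -\lambda \, \nabla_{\beta} \lVert \beta \rVert_2^2
```

Each \lambda \ge 0 in the penalized form corresponds to some budget t, which is why the two pictures coincide; for Lasso, replace the L2 ball with the L1 ball.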

Always find interesting things here. Keep going, good luck!

sivakrishna

Nice video, thanks! The only thing I think is slightly imprecise is describing polynomials with increasing degrees as "complex". Since you are talking about maths, I was expecting to see the imaginary unit when I first heard "complex".


Great content on your channel, I just found it! Heh, I used Desmos to debug/visualize too!

I just added a video explaining easy multilayer backpropagation. The book math with all the subscripts is confusing, so I did it without any. Much simpler to understand.

TheRainHarvester

Love your awesome videos! Salute! Thank you so much!

fujinzhou

How does Gauss-Newton for nonlinear regression change with (L2) regularization?

alexandergeorgiev
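
A sketch of the standard answer to the question above (not the channel author's reply): with an L2 penalty \lambda \lVert \beta \rVert_2^2, each Gauss-Newton step gains a \lambda I term in the normal equations and a \lambda \beta term in the gradient, which is closely related to Levenberg-Marquardt damping.

```python
import numpy as np

def gauss_newton_l2(residual, jacobian, beta0, lam=1e-2, iters=50, tol=1e-10):
    """Gauss-Newton for min ||r(beta)||^2 + lam * ||beta||^2.

    Unregularized Gauss-Newton solves (J^T J) d = -J^T r each step.
    With the L2 penalty, the step becomes
        (J^T J + lam * I) d = -(J^T r + lam * beta),
    adding the penalty's gradient and (exact) Hessian contribution.
    """
    beta = np.asarray(beta0, dtype=float)
    I = np.eye(beta.size)
    for _ in range(iters):
        r, J = residual(beta), jacobian(beta)
        grad = J.T @ r + lam * beta  # gradient of the penalized objective (up to a factor of 2)
        step = np.linalg.solve(J.T @ J + lam * I, -grad)
        beta = beta + step
        if np.linalg.norm(step) < tol:
            break
    return beta

# Hypothetical example: fit y = a * exp(b * x) with a small ridge penalty.
x = np.linspace(0.0, 1.0, 30)
y = 2.0 * np.exp(1.5 * x)
r = lambda b: b[0] * np.exp(b[1] * x) - y
J = lambda b: np.column_stack([np.exp(b[1] * x), b[0] * x * np.exp(b[1] * x)])
print(gauss_newton_l2(r, J, beta0=[1.0, 1.0]))  # approaches (2, 1.5) for small lam
```

Levenberg-Marquardt adds a similar \lambda I (or \lambda \, \mathrm{diag}(J^T J)) term to stabilize the step, but it damps the step around the current iterate rather than penalizing \beta itself, so its gradient has no \lambda \beta term.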