mlcourse.ai. Lecture 6. Part 1. Linear regression. Theory

Here we cover linear regression from a Machine Learning perspective. We also cover a fundamental ML technique called regularization.
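
As a rough illustration of the two topics, here is a minimal sketch of linear regression with L2 regularization (ridge) solved in closed form; the synthetic data, variable names, and lambda value are illustrative assumptions, not taken from the lecture.

import numpy as np

# Synthetic 1-D data: y = 3x + 1 plus noise (illustrative assumption).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 1))
y = 3 * X[:, 0] + 1 + 0.5 * rng.normal(size=100)

# Add an intercept column and solve the ridge normal equations
# w = (X^T X + lambda * I)^{-1} X^T y.
X_b = np.hstack([np.ones((100, 1)), X])
lam = 1.0
penalty = lam * np.eye(X_b.shape[1])
penalty[0, 0] = 0.0  # by convention, the intercept is not penalized
w = np.linalg.solve(X_b.T @ X_b + penalty, X_b.T @ y)
print(w)  # close to [1, 3]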

Comments

Known issues:
- Framing: the bottom of the frame is sometimes not visible (though it's not crucial for understanding the material).
- 33:25: the sign between w_0 and w_1 should be minus, not plus.
- 45:40: a factor of 2 is missing.
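
For reference, with the usual sum-of-squared-errors loss for simple linear regression (an assumption about what is on the slide at those timestamps), the corrected derivative carries both the minus sign and the factor of 2:

\frac{\partial}{\partial w_0} \sum_{i=1}^{n} (y_i - w_0 - w_1 x_i)^2 = -2 \sum_{i=1}^{n} (y_i - w_0 - w_1 x_i)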

festline

I might be wrong, but shouldn't the derivative at 45:20 be ∑ 2(y_i − w_0 − w_1 x_i) · (−1)? Please tell me if I am wrong.

arnavrajurkar

Is there a relationship between the diagonal matrix S of the SVD decomposition (X = U S V*) and regularization? The way you described it at 1:09:45, the matrix looks similar to S.
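
For context, there is a standard identity connecting the two (a general fact, not necessarily the exact derivation shown at 1:09:45): writing the thin SVD as X = U S V^\top with S = \mathrm{diag}(\sigma_1, \dots, \sigma_p) and X of full column rank, the ridge solution is

\hat{w} = (X^\top X + \lambda I)^{-1} X^\top y = V \, \mathrm{diag}\left( \frac{\sigma_i}{\sigma_i^2 + \lambda} \right) U^\top y

so L2 regularization shrinks the component along each singular direction by the factor \sigma_i^2 / (\sigma_i^2 + \lambda), damping the small-\sigma_i (ill-conditioned) directions the most.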

vijayendrasdm

Thanks for the amazing lecture. I have one question, though.

Why does L1 regularization drag feature weights to zero while L2 doesn't?
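
A minimal sketch that makes the difference visible with scikit-learn (the synthetic data and alpha values are illustrative assumptions): Lasso (L1) sets the irrelevant coefficients exactly to zero, while Ridge (L2) only shrinks them toward zero.

import numpy as np
from sklearn.linear_model import Lasso, Ridge

# Synthetic data: only the first 3 of 10 features matter (illustrative).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
true_w = np.array([3.0, -2.0, 1.5] + [0.0] * 7)
y = X @ true_w + 0.1 * rng.normal(size=200)

lasso = Lasso(alpha=0.1).fit(X, y)
ridge = Ridge(alpha=0.1).fit(X, y)

# Lasso zeroes out the 7 irrelevant coefficients; Ridge leaves them
# small but nonzero, because the L2 penalty's gradient vanishes at 0
# while the L1 penalty keeps a constant pull toward 0.
print("lasso:", np.round(lasso.coef_, 3))
print("ridge:", np.round(ridge.coef_, 3))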

vijayendrasdm

Isn't there a mistake at 33:25? Shouldn't it be y − w_0 − w_n x_i instead of y − w_0 + w_n x_i?

gypsysound_