Machine Learning Foundations/Techniques: Regularization


5:55 Start
15:46 Regularization
18:50 Stepping Back as Constraint
20:49 Regression with Constraint
22:49 Regression with Looser Constraint
26:59 Regression with Softer Constraint
30:39 Teacher Explanation
34:54 QA
45:24 Matrix Form of Regularized Regression Problem
47:00 The Lagrange Multiplier
52:26 Augmented Error
59:16 The Results
1:01:07 Some Detail: Legendre Polynomials
1:04:06 Teacher Explanation (The Lagrange Multiplier)
1:11:55 QA
1:32:56 Regularization and VC Theory

meng-ruwu

00:05:31 Start
00:08:52 Review of the previous lecture
00:15:31 (Lecture14 Regularization)
00:16:28 overfit -> regularized fit
00:18:39 how to step back = constraint
00:20:49 step back = constraint optimization of E_in
00:22:49 looser constraint: sparse hypothesis set (NP-hard)
00:26:59 softer constraint: regularized hypothesis set
00:30:41 Teacher's supplementary explanation
00:34:33 QA
00:34:53
00:37:53 Teacher's sharing
00:42:15 Online QA (comparing hypotheses)
00:44:27 Online QA (simplifying the constraint)
00:45:25 (Lecture14) matrix form of regularized regression
00:47:01 Lagrange multiplier
00:52:27 ridge regression
00:55:33 augmented error
00:59:17 weight-decay regularization: large λ => prefer shorter w => effectively smaller C
01:01:07 Legendre polynomials
01:04:06 Teacher's supplementary explanation
01:11:40 QA
01:11:55 In-class QA (relationship between λ and C)
01:15:37 In-class QA (Legendre polynomials)
01:22:52 In-class QA (how to choose C and λ)
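The weight-decay note above (large λ ⇒ prefer shorter w) can be sketched with the ridge-regression closed form covered in the lecture, w_reg = (XᵀX + λI)⁻¹Xᵀy. This is a minimal illustration on synthetic data; the function name `ridge` and the data are assumptions, not from the lecture.

```python
import numpy as np

def ridge(X, y, lam):
    """Closed-form solution of the regularized least-squares problem:
    minimize ||Xw - y||^2 + lam * ||w||^2, i.e. w = (X^T X + lam*I)^{-1} X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Synthetic regression data (illustrative only).
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 5))
true_w = np.array([3.0, -2.0, 1.0, 0.5, -1.0])
y = X @ true_w + 0.1 * rng.standard_normal(50)

# Larger lambda shrinks the weight vector: ||w(lam)|| is decreasing in lam,
# which is the "effectively smaller C" picture from the constrained view.
norms = [float(np.linalg.norm(ridge(X, y, lam))) for lam in (0.0, 1.0, 10.0, 100.0)]
```

Here `norms` decreases monotonically, matching the intuition that the soft penalty λ‖w‖² plays the same role as a hard constraint ‖w‖² ≤ C with smaller C.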

glhuang