Week-10 | TA Session-1

11:28 — 15:57

(Week 9) Perceptron when not linearly separable: using kernel trick
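The kernel-trick perceptron recapped in this segment can be sketched in its dual form: each training point carries a mistake count α_i, and predictions use kernel evaluations in place of an explicit weight vector. A minimal NumPy sketch (the RBF kernel and the XOR-style data are illustrative choices, not taken from the session):

```python
import numpy as np

def rbf(x, z, gamma=1.0):
    # RBF kernel: the implicit feature map handles data that is not
    # linearly separable in the input space
    return np.exp(-gamma * np.sum((x - z) ** 2))

def kernel_perceptron(X, y, kernel=rbf, epochs=10):
    # Dual perceptron: alpha[i] counts mistakes on point i; the score of a
    # point is sum_i alpha[i] * y[i] * k(x_i, x) instead of w . x
    n = len(X)
    alpha = np.zeros(n)
    for _ in range(epochs):
        for j in range(n):
            s = sum(alpha[i] * y[i] * kernel(X[i], X[j]) for i in range(n))
            if y[j] * s <= 0:          # mistake: bump this point's weight
                alpha[j] += 1
    return alpha

# XOR-like data: not linearly separable in the input space
X = np.array([[0., 0.], [1., 1.], [0., 1.], [1., 0.]])
y = np.array([1, 1, -1, -1])
alpha = kernel_perceptron(X, y)
preds = [np.sign(sum(alpha[i] * y[i] * rbf(X[i], x) for i in range(len(X))))
         for x in X]                   # all four points classified correctly
```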

\\\

26:56 — 39:22

Expression for pair of separators with non-zero margin

8:12 — 11:08

Width between separators is inversely related to length of the weight vector
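The inverse relation can be written out in the standard notation (with the pair of separators grounded at ±1, as done later in the session):

```latex
% pair of parallel separators with non-zero margin
w^{\top}x + b = +1, \qquad w^{\top}x + b = -1
% their separation, measured along the unit normal w/\|w\|:
\text{width} = \frac{2}{\|w\|}
```

So a shorter weight vector means a wider margin, which is why `w` is the decision variable in the next segment.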

• AQ 10.2 Q2, Q5

20:00 — 21:42

Vary `w` to achieve desired width between separators
(only decision variable is `w`)

58:29 — 1:02:03

Difference between gamma and the width between separators;
grounding (fixing) gamma = 1

\\\

39:46 — 50:02

Motivating and setting up the optimization problem for SVM

• AQ 10.2 Q6
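The optimization problem set up in this segment is, in the standard hard-margin form (maximizing the width 2/||w|| is equivalent to minimizing ||w||²/2):

```latex
\min_{w,\,b}\ \frac{1}{2}\|w\|^{2}
\quad\text{subject to}\quad
y_i\,(w^{\top}x_i + b) \ge 1, \quad i = 1, \dots, n
```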

50:03 — 58:27

Method of Lagrange: converting a constrained optimization problem into an equivalent unconstrained one, viz. the Lagrangian

Duality

• 1:12:32 — 1:14:06 AQ 10.4 Q2
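For the SVM problem above, the Lagrangian and its dual take the standard form (one multiplier per constraint):

```latex
% one multiplier \alpha_i \ge 0 per margin constraint
L(w, b, \alpha) = \frac{1}{2}\|w\|^{2}
  - \sum_{i=1}^{n} \alpha_i \bigl[\, y_i (w^{\top} x_i + b) - 1 \,\bigr]
% dual problem: swap the order of min and max
\max_{\alpha \ge 0}\ \min_{w,\,b}\ L(w, b, \alpha)
```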

2:00:01 — 2:04:03

Solving the Lagrangian

1:14:10 — 1:16:55

w* is a linear combination of the datapoints, with the weights given by α*.
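In symbols, this is the standard result of setting the gradient of the Lagrangian to zero:

```latex
\nabla_w L = 0 \;\Longrightarrow\;
w^{*} = \sum_{i=1}^{n} \alpha_i^{*}\, y_i\, x_i
\qquad
\frac{\partial L}{\partial b} = 0 \;\Longrightarrow\;
\sum_{i=1}^{n} \alpha_i^{*}\, y_i = 0
```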

2:04:05 — 2:12:29

KKT conditions > Complementary slackness

Either the constraint is active (holds with equality), or the Lagrange multiplier on it must be 0 at the optimum.
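In the standard notation, for a constraint g_i(x) ≥ 0 with multiplier α_i ≥ 0:

```latex
\alpha_i^{*}\; g_i(x^{*}) = 0 \quad \text{for all } i
```

At the optimum, at least one factor in each product must vanish: the constraint is tight, or its multiplier is zero.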

2:12:32 — 2:16:14

Interpreting complementary slackness for SVM:

Either a datapoint lies on one of the separators (the margin hyperplanes), or it necessarily gets weight α = 0 in the calculation of w*.

Datapoints on a separator are called 'support vectors'.

• 1:17:09 — 1:23:28 AQ 10.5 Q6
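This interpretation can be checked numerically on a toy problem. The 1-D data and the hand-solved optimum below are an assumed example, not from the session: with `w* = 1, b* = 0`, only the two points on the separators (the support vectors) get non-zero α, and `w*` is recovered from them alone:

```python
import numpy as np

# Toy 1-D hard-margin SVM, solved by hand (illustrative example):
# separators sit at x = -1 and x = +1
x = np.array([-2.0, -1.0, 1.0, 3.0])
y = np.array([-1.0, -1.0, 1.0, 1.0])
w, b = 1.0, 0.0
alpha = np.array([0.0, 0.5, 0.5, 0.0])   # dual solution: only margin points weighted

margins = y * (w * x + b)                 # functional margin of each point
for a_i, m_i in zip(alpha, margins):
    # complementary slackness: alpha_i * (margin_i - 1) == 0 for every point
    assert np.isclose(a_i * (m_i - 1.0), 0.0)

support_vectors = x[alpha > 0]            # exactly the points on the separators
# w* is a linear combination of the datapoints, weighted by alpha* and y
assert np.isclose(np.sum(alpha * y * x), w)
```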

\\\

1:24:14 — 1:28:59

Early primer on soft-margin SVM
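The soft-margin variant previewed here is usually written with slack variables ξ_i that let points violate the margin, traded off by a hyperparameter C (standard formulation):

```latex
\min_{w,\,b,\,\xi}\ \frac{1}{2}\|w\|^{2} + C \sum_{i=1}^{n} \xi_i
\quad\text{s.t.}\quad
y_i (w^{\top} x_i + b) \ge 1 - \xi_i, \qquad \xi_i \ge 0
```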

\\\

1:44:41 — 1:55:20

Open discussion: Special status of eigenvectors, covariance matrix
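The special status discussed here, that eigenvectors of the covariance matrix point along the directions of maximal variance (the idea behind PCA), can be seen in a short NumPy sketch. The data and seed below are illustrative assumptions:

```python
import numpy as np

# Generate data stretched along the x-axis, so that is the direction
# of maximal variance
rng = np.random.default_rng(0)
Z = rng.normal(size=(500, 2)) @ np.array([[3.0, 0.0], [0.0, 0.5]])

C = np.cov(Z, rowvar=False)            # 2x2 sample covariance matrix
vals, vecs = np.linalg.eigh(C)         # eigh: for symmetric matrices,
                                       # eigenvalues in ascending order
top = vecs[:, np.argmax(vals)]         # eigenvector of the largest eigenvalue
# `top` should be (close to) the x-axis, the direction of maximal variance
```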

\\\

• Please watch at 1.25x or 1.5x speed as suitable.

Thanks for watching. 🙏

shubharupG