Optimization Techniques - W2023 - Lecture 6 (KKT Conditions & Gradient Descent)

The course "Optimization Techniques" (ENGG*6140, section 2) at the School of Engineering at the University of Guelph. Instructor: Benyamin Ghojogh
Lecture 6 continues from the Lagrangian function and duality to the KKT (Karush-Kuhn-Tucker) conditions, i.e., the stationarity condition, primal feasibility, dual feasibility, and complementary slackness. Then, we explain the method of Lagrange multipliers. Afterwards, we start first-order optimization, beginning with the fundamental theorem of calculus. We then introduce gradient descent and derive its step size (learning rate). Finally, we explain line-search, Armijo (backtracking) line-search [also known as the Armijo-Goldstein condition], and convergence criteria.
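For reference, the KKT conditions named above, for a problem of the form min_x f(x) subject to g_i(x) <= 0 and h_j(x) = 0, can be summarized as follows (the notation here is a standard convention, not necessarily the lecture's):

\begin{aligned}
&\nabla f(x^\star) + \textstyle\sum_i \lambda_i^\star \nabla g_i(x^\star) + \sum_j \nu_j^\star \nabla h_j(x^\star) = 0 && \text{(stationarity)} \\
&g_i(x^\star) \le 0, \quad h_j(x^\star) = 0 && \text{(primal feasibility)} \\
&\lambda_i^\star \ge 0 && \text{(dual feasibility)} \\
&\lambda_i^\star\, g_i(x^\star) = 0 && \text{(complementary slackness)}
\end{aligned}

The following is a minimal Python sketch of gradient descent with Armijo (backtracking) line-search, roughly in the spirit of the second half of the lecture; the function names, default constants, and the toy objective are illustrative assumptions, not the lecture's own code:

import numpy as np

def armijo_backtracking(f, grad_f, x, direction, alpha0=1.0, beta=0.5, c=1e-4):
    # Shrink the step size until the Armijo (sufficient-decrease) condition holds:
    # f(x + alpha*d) <= f(x) + c * alpha * grad_f(x)^T d
    alpha = alpha0
    fx = f(x)
    slope = grad_f(x) @ direction  # directional derivative; negative along a descent direction
    while f(x + alpha * direction) > fx + c * alpha * slope:
        alpha *= beta  # backtrack: shrink the step geometrically
    return alpha

def gradient_descent(f, grad_f, x0, tol=1e-6, max_iter=1000):
    # Gradient descent with Armijo backtracking line-search; stops when the
    # gradient norm drops below tol (a common convergence criterion).
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:
            break
        d = -g  # steepest-descent direction
        alpha = armijo_backtracking(f, grad_f, x, d)
        x = x + alpha * d
    return x

# Toy example: minimize f(x) = x1^2 + 10*x2^2 (minimizer at the origin)
f = lambda x: x[0]**2 + 10.0 * x[1]**2
grad_f = lambda x: np.array([2.0 * x[0], 20.0 * x[1]])
print(gradient_descent(f, grad_f, x0=[3.0, -2.0]))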
Some parts of this lecture were inspired by the lectures of Prof. Kimon Fountoulakis at the University of Waterloo.

Chapters:

0:00 - Talking about the midterm exam
14:48 - The dual variables in the interpretation of Lagrangian function
18:38 - Stationarity condition in KKT conditions
25:28 - Complementary slackness in KKT conditions
35:53 - KKT conditions
39:41 - Primal and dual residuals in KKT conditions
46:31 - Max-min optimization in dual problem
52:31 - History of KKT conditions
54:49 - Method of Lagrange multipliers
1:05:43 - Example for method of Lagrange multipliers
1:19:25 - Historical papers in KKT conditions and duality
1:21:00 - Fundamental theorem of calculus
1:29:45 - Why we need numerical optimization
1:32:25 - Introduction to gradient descent
1:37:31 - The step size in gradient descent
1:47:54 - Relaxing the step size in gradient descent
2:04:34 - Iterative solution of gradient descent
2:14:17 - Line-search
2:18:19 - Gradient descent algorithm with line-search
2:21:27 - Armijo (backtracking) line-search
2:26:49 - Convergence criteria
2:32:30 - Brief overview of the next topics