Optimization Techniques - W23 - Lecture 8 (Proximal Methods, Newton's Method, Interior-Point Method)

The course "Optimization Techniques" (ENGG*6140, section 2) at the School of Engineering at the University of Guelph. Instructor: Benyamin Ghojogh
Lecture 8 continues the first-order optimization methods by introducing proximal methods and then moves on to second-order optimization. In the family of proximal methods, we cover the proximal mapping (proximal operator), the Moreau-Yosida regularization (Moreau envelope), the Moreau decomposition, the indicator function, soft and hard penalties, projection onto a set, the Moreau decomposition for norms, the proximal mappings of the l2 norm and the l1 norm, soft-thresholding, the proximal point algorithm (proximal minimization), composite optimization, the proximal gradient method (proximal gradient descent), the projected gradient method (also called the gradient projection method or projected gradient descent), singular value decomposition and projection onto the cone of orthogonal matrices, and projection onto convex sets (POCS).
Then we shift gears to second-order optimization. We start with the Newton-Raphson root-finding method and derive the univariate Newton's method from it, then extend it to the multivariate Newton's method. We justify Newton's method for unconstrained optimization and then cover Newton's method for optimization with equality constraints. Afterwards, we explain the interior-point method, also called the barrier method, the Unconstrained Minimization Technique (UMT), or the Sequential UMT (SUMT), for general optimization with both equality and inequality constraints. We specifically introduce the logarithmic barrier (log barrier) as an example of a barrier method. Finally, we discuss the accuracy of the log barrier method.
Parts of this lecture were inspired by the lectures of Prof. Kimon Fountoulakis at the University of Waterloo.
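As a concrete illustration of the first-order part above, here is a minimal sketch of the proximal gradient method for the lasso problem, where the proximal mapping of the l1 norm is the soft-thresholding operator. This sketch is not taken from the lecture; the toy matrix A, vector b, step size, and regularization weight lam are illustrative choices.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal mapping of tau * ||.||_1: shrinks each entry toward zero by tau.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def proximal_gradient_lasso(A, b, lam, step, n_iters=500):
    # Composite optimization: f(x) = 0.5 * ||Ax - b||^2 (smooth) + lam * ||x||_1 (non-smooth).
    # Each iteration takes a gradient step on the smooth term, then applies the
    # proximal mapping of the non-smooth term (soft-thresholding).
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)                      # gradient of the smooth term
        x = soft_threshold(x - step * grad, step * lam)
    return x

# Toy usage with a random problem (illustrative only).
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [2.0, -1.5, 1.0]                         # sparse ground truth
b = A @ x_true + 0.01 * rng.standard_normal(50)
step = 1.0 / np.linalg.norm(A, 2) ** 2                # 1 / Lipschitz constant of the gradient
x_hat = proximal_gradient_lasso(A, b, lam=0.1, step=step)
print(np.round(x_hat, 2))                             # recovers an approximately sparse solution
```

With this step size the update is the standard ISTA iteration; accelerated variants or a line search could be swapped in without changing the proximal step itself.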

Chapters:

0:00 - Start of the class
1:14 - Proximal mapping (proximal operator)
8:14 - Moreau-Yosida regularization (Moreau envelope)
9:59 - Moreau decomposition
12:12 - Indicator function and soft/hard penalty
30:51 - Projection onto set
37:03 - Moreau decomposition for norm
40:38 - Proximal mapping of the zero function
42:44 - Proximal mapping of l2 norm
49:51 - Soft-thresholding and proximal mapping of l1 norm
58:13 - Regularized optimization with norm (l2 norm and l1 norm (lasso))
1:10:32 - Proximal point algorithm (proximal minimization)
1:15:47 - Composite optimization & regularized optimization
1:16:52 - Proximal gradient method (proximal gradient descent)
1:24:58 - Projected gradient method (projected gradient descent)
1:31:07 - Projection onto cone of orthogonal matrices
1:41:05 - Projection onto convex sets (POCS)
1:46:37 - Acknowledgment
1:47:52 - References for the first-order methods
1:56:36 - Reason for the name "proximal" in proximal mapping
1:57:52 - Question about assignment 2
2:00:45 - Newton-Raphson root finding method
2:03:37 - Univariate Newton's method
2:06:12 - Multivariate Newton's method
2:07:36 - Newton's method for unconstrained optimization
2:11:32 - Newton's method for optimization with equality constraints
2:18:54 - Handling infeasible point in Newton's method
2:22:20 - Interior-point method (barrier methods) for inequality constraints
2:27:39 - Log barrier (logarithmic barrier) method
2:31:55 - Accuracy of the log barrier method
Comments

In the interior-point method with a log barrier, if the initial point does not satisfy the inequality constraints, the argument of the logarithm will be negative. Therefore, you cannot compute the gradient, as the function is not defined at that point. What should we do in this situation?

AliArjomandi-fxev
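Regarding the question above, a common remedy (a sketch under standard assumptions, not an official answer from the lecture): treat the log-barrier objective as +inf outside the strictly feasible region, so a damped/backtracking step never accepts an infeasible trial point, and if even the initial point is infeasible, first solve a phase-I problem (e.g. minimize s subject to g_i(x) <= s) to obtain a strictly feasible start. The function names and the toy problem below are illustrative.

```python
import numpy as np

def barrier_value(x, f, g, t):
    # Log-barrier objective: t * f(x) - sum_i log(-g_i(x)).
    # It is only defined where every g_i(x) < 0; returning +inf outside that
    # region makes the backtracking search below reject infeasible trial points.
    gx = g(x)
    if np.any(gx >= 0):
        return np.inf
    return t * f(x) - np.sum(np.log(-gx))

def damped_step(x, direction, f, g, t, step=1.0, beta=0.5, max_halvings=50):
    # Shrink the step until the trial point is strictly feasible and decreases
    # the barrier objective, so the iterate never leaves the domain of the log.
    current = barrier_value(x, f, g, t)
    for _ in range(max_halvings):
        trial = x + step * direction
        if barrier_value(trial, f, g, t) < current:
            return trial
        step *= beta
    return x  # no acceptable step found; keep the current (feasible) point

# Toy usage: minimize (x - 2)^2 subject to x - 1 <= 0, starting strictly feasible.
f = lambda x: float((x[0] - 2.0) ** 2)
g = lambda x: np.array([x[0] - 1.0])
t = 10.0
x = np.array([0.0])                            # strictly feasible start: g(x) = -1 < 0
grad = 2.0 * t * (x - 2.0) + 1.0 / (1.0 - x)   # gradient of the barrier objective at x
x = damped_step(x, -grad, f, g, t)
print(x)                                       # moves toward x = 1 while staying feasible
```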