Harvard AM205 video 4.9 - Quasi-Newton methods

Harvard Applied Math 205 is a graduate-level course on scientific computing and numerical methods. The previous video in this series discussed using the Newton method to find local minima of a function; while this method can be highly efficient, it requires the exact Hessian of the function in order to work, which can be expensive and laborious to compute in some cases.
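
Concretely, the Newton iteration for minimizing a smooth function f uses both the gradient and the exact Hessian:

\[
x_{k+1} = x_k - \left[\nabla^2 f(x_k)\right]^{-1} \nabla f(x_k).
\]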

Here we discuss quasi-Newton methods that do not require the exact Hessian. In particular, the Broyden–Fletcher–Goldfarb–Shanno (BFGS) method is presented in detail. When describing the mathematics behind the BFGS method, the related Broyden root-finding algorithm is also discussed.
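
As a rough illustration of the idea (a minimal sketch in Python/NumPy, not code from the lecture), BFGS maintains an approximation H to the inverse Hessian and updates it from gradient differences instead of computing the exact Hessian; the backtracking line search and the Rosenbrock test function below are my own choices for the example:

import numpy as np

def bfgs(f, grad, x0, tol=1e-8, max_iter=200):
    # Quasi-Newton BFGS: keep an approximation H of the *inverse* Hessian,
    # updated from gradient differences rather than exact second derivatives.
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                      # initial inverse-Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                     # quasi-Newton search direction
        t = 1.0                        # simple backtracking (Armijo) line search
        while f(x + t * p) > f(x) + 1e-4 * t * (g @ p):
            t *= 0.5
        x_new = x + t * p
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy > 1e-12:                 # curvature condition; skip update otherwise
            rho = 1.0 / sy
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Quick test on the Rosenbrock function (minimum at [1, 1]):
rosen = lambda z: (1 - z[0])**2 + 100 * (z[1] - z[0]**2)**2
rosen_grad = lambda z: np.array([-2 * (1 - z[0]) - 400 * z[0] * (z[1] - z[0]**2),
                                 200 * (z[1] - z[0]**2)])
print(bfgs(rosen, rosen_grad, [-1.2, 1.0]))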

Comments

Excellent video, it really helped me understand the quasi-Newton method, thank you very much!

dwyanechou

Very nice! I have a question that I think you can answer: what is the objective function of SR1?
When I solve min ‖J_k − J_{k−1}‖_F subject to the secant equation (as you explain at 7:17), I get Broyden's update.
But if I solve min ‖J_k − J_{k−1}‖_F subject to the secant equation plus the constraints that the update is rank one and symmetric, the solution I get is different from SR1.

hyukppen
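
For reference on the question above, the standard least-change characterization of Broyden's root-finding update and the usual SR1 formula (as given in standard references, not necessarily in the video's notation) are, with s_k the step and y_k the corresponding change in F (for Broyden) or in ∇f (for SR1):

\[
J_k \;=\; \operatorname*{arg\,min}_{J \,:\, J s_k = y_k} \|J - J_{k-1}\|_F
\;=\; J_{k-1} + \frac{(y_k - J_{k-1} s_k)\, s_k^{\mathsf T}}{s_k^{\mathsf T} s_k}
\qquad \text{(Broyden)},
\]
\[
B_{k+1} \;=\; B_k + \frac{(y_k - B_k s_k)(y_k - B_k s_k)^{\mathsf T}}{(y_k - B_k s_k)^{\mathsf T} s_k}
\qquad \text{(SR1)}.
\]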

Very nice video
Keep up the good work :)

fabianbirringer

Excellent video! I have a question: you discussed both Broyden's method and BFGS. Broyden's method requires no explicit knowledge of the gradient (∇F), so it looks simpler when the exact gradient is not available and a numerical gradient is not cheap. Does that mean Broyden's method converges more slowly than BFGS? What are its pros and cons compared to BFGS?

vivy