Optimization Techniques J PELFORT

Min f = 100 * [ y^2 * (3 - x) - x^2 * (3 + x) ]^2 + (2 + x)^2 / (1 + (2 + x)^2)
Minima are found at x = -2, y = +/- 0.89442719.
This function was proposed by J. Stoer and R. Bulirsch, "Introduction to Numerical Analysis", Springer Verlag (1980), page 357.
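For reference, here is a minimal Python sketch (not the routine shown in the video) that evaluates the function and checks the two reported minima:

import math

def f(x, y):
    # Test function from Stoer & Bulirsch (1980), p. 357
    return 100.0 * (y**2 * (3.0 - x) - x**2 * (3.0 + x))**2 \
           + (2.0 + x)**2 / (1.0 + (2.0 + x)**2)

for y0 in (math.sqrt(0.8), -math.sqrt(0.8)):   # y = +/- 0.89442719
    print(f(-2.0, y0))                          # both values are ~0, the minimum of f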
I only employ line search methods to determine the step length, but there are many other approaches, including random search and, of course, trust-region methods, which do not appear in this video.
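As an illustration of one common step-length rule, here is a minimal Python sketch of Armijo backtracking; the names f and grad and the constants are assumptions for the example, not necessarily what the video's routine uses:

import numpy as np

def backtracking_line_search(f, grad, x, d, alpha=1.0, rho=0.5, c=1e-4):
    # Shrink the step until the sufficient-decrease (Armijo) condition holds.
    # f: objective, grad: gradient, x: current point, d: descent direction.
    fx = f(x)
    slope = c * grad(x).dot(d)          # must be negative for a descent direction
    while f(x + alpha * d) > fx + alpha * slope:
        alpha *= rho
    return alpha

For example, with d = -grad(x) this gives the step used by a simple steepest-descent iteration.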
Derivatives in this routine are constructed via forward finite differences (rather than using the analytical ones). When the function is non-convex, escaping saddle points requires a feasible descent direction along the eigenvectors associated with the negative eigenvalues of the Hessian matrix.
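Here is a minimal Python sketch of a forward-difference gradient and of choosing an escape direction from a saddle via the most negative Hessian eigenvalue (an illustration under these assumptions, not the exact routine of the video):

import numpy as np

def fd_gradient(f, x, h=1e-6):
    # Forward differences: g_i ~ (f(x + h*e_i) - f(x)) / h
    g = np.zeros_like(x)
    fx = f(x)
    for i in range(x.size):
        xh = x.copy(); xh[i] += h
        g[i] = (f(xh) - fx) / h
    return g

def fd_hessian(f, x, h=1e-4):
    # Forward differences of the gradient, symmetrized.
    n = x.size
    H = np.zeros((n, n))
    g0 = fd_gradient(f, x, h)
    for i in range(n):
        xh = x.copy(); xh[i] += h
        H[:, i] = (fd_gradient(f, xh, h) - g0) / h
    return 0.5 * (H + H.T)

def escape_direction(f, x):
    # At a saddle point the gradient is ~0; move along the eigenvector
    # of the most negative Hessian eigenvalue instead.
    w, V = np.linalg.eigh(fd_hessian(f, x))
    return V[:, 0] if w[0] < 0.0 else -fd_gradient(f, x)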
Bibliography that I personally recommend:
M.R. Hestenes and E.L. Stiefel, "Methods of Conjugate Gradients for Solving Linear Systems", Journal of Research of the National Bureau of Standards, Section B, Vol. 49, pages 409 onwards (1952)
R. Fletcher and C.M. Reeves, "Function Minimization by Conjugate Gradients", The Computer Journal, Vol. 7, pages 149-154 (1964)
R. Fletcher, "A New Approach to Variable Metric Algorithms", The Computer Journal, Vol. 13, pages 317 onwards (1970)
W.C. Davidon, "Variable Metric Method for Minimization", A.E.C. Research Report ANL-5990, pages 21 onwards (1959)
R. Fletcher and M.J.D. Powell, "A Rapidly Convergent Descent Method for Minimization", The Computer Journal, Vol. 6, pages 163 onwards (1963)
E. Polak and G. Ribière, "Sur la méthode de D.F.P. pour la minimisation des fonctions", Management Science Review, Vol. 16, pages 572 onwards (1970)
R. Hooke and T.A. Jeeves, "Direct Search Solution of Numerical and Statistical Problems", Journal of the Association for Computing Machinery, Vol. 8, pages 212-229 (1961); see also Nesterov's accelerated steepest descent.