Edouard Pauwels - Inexact subgradient algorithm with errors for semialgebraic functions

This talk was part of the Workshop on "One World Optimization Seminar in Vienna" held at the ESI June 3 -- 7, 2024.
Motivated by the widespread use of approximate derivatives in machine learning and optimization, we study inexact subgradient methods with non-vanishing additive errors. In the nonconvex semialgebraic setting, under boundedness or coercivity assumptions, we prove that the method provides points that eventually fluctuate close to the critical set, in relation to the geometry of the problem and the magnitude of the errors. We cover two step size regimes: vanishing step sizes and small constant step sizes. The main technique relates to the ODE method, and we obtain, as byproducts of our analysis, a descent-like lemma for nonsmooth nonconvex problems and an invariance result for the small-step limits of algorithmic sequences.
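A minimal sketch of the kind of iteration the abstract describes: x_{k+1} = x_k - alpha_k (g_k + e_k), where g_k is a subgradient at x_k and e_k is a bounded, non-vanishing additive error. All names, the test function, and the error model below are illustrative assumptions, not details taken from the talk.

```python
import numpy as np

def inexact_subgradient(subgrad, x0, steps, err_level, rng=None):
    """Sketch of an inexact subgradient method with additive errors.

    subgrad   : oracle returning a subgradient at x (hypothetical interface)
    steps     : iterable of step sizes alpha_k
    err_level : bound on the error norm, ||e_k|| <= err_level (non-vanishing)
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for alpha in steps:
        g = subgrad(x)                                   # subgradient oracle call
        e = rng.uniform(-1.0, 1.0, size=x.shape)         # random error direction
        e *= err_level / max(np.linalg.norm(e), 1e-12)   # scale so ||e_k|| <= err_level
        x = x - alpha * (g + e)                          # inexact subgradient step
    return x

# Toy nonsmooth semialgebraic example: f(x) = |x1| + x2^2,
# with subgradient (sign(x1), 2*x2).
f_sub = lambda x: np.array([np.sign(x[0]), 2.0 * x[1]])

# Regime 1: vanishing step sizes alpha_k = 1/(k+1).
x_van = inexact_subgradient(f_sub, [2.0, -1.5], [1.0 / (k + 1) for k in range(500)], 0.05)

# Regime 2: small constant step size alpha_k = 0.01.
x_cst = inexact_subgradient(f_sub, [2.0, -1.5], [0.01] * 500, 0.05)
```

Because the errors do not vanish, neither run is expected to converge exactly; consistent with the result stated above, the iterates eventually fluctuate near the critical set (here, near x1 = 0, x2 = 0) at a distance governed by err_level.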