TY - JOUR
T1 - Inexact subgradient methods for semialgebraic functions
AU - Bolte, Jérôme
AU - Le, Tam
AU - Moulines, Eric
AU - Pauwels, Edouard
N1 - Publisher Copyright:
© Springer-Verlag GmbH Germany, part of Springer Nature and Mathematical Optimization Society 2025.
PY - 2025/1/1
Y1 - 2025/1/1
N2 - Motivated by the extensive application of approximate gradients in machine learning and optimization, we investigate inexact subgradient methods subject to persistent additive errors. Within a nonconvex semialgebraic framework, assuming boundedness or coercivity, we establish that the method yields iterates that eventually fluctuate near the critical set at a proximity characterized by an O(ϵ^ρ) distance, where ϵ denotes the magnitude of subgradient evaluation errors, and ρ encapsulates geometric characteristics of the underlying problem. Our analysis comprehensively addresses both vanishing and constant step-size regimes. Notably, the latter regime inherently enlarges the fluctuation region, yet this enlargement remains on the order of ϵ^ρ. In the convex scenario, employing a universal error bound applicable to coercive semialgebraic functions, we derive novel complexity results concerning averaged iterates. Additionally, our study produces auxiliary results of independent interest, including descent-type lemmas for nonsmooth nonconvex functions and an invariance principle governing the behavior of algorithmic sequences under small-step limits.
AB - Motivated by the extensive application of approximate gradients in machine learning and optimization, we investigate inexact subgradient methods subject to persistent additive errors. Within a nonconvex semialgebraic framework, assuming boundedness or coercivity, we establish that the method yields iterates that eventually fluctuate near the critical set at a proximity characterized by an O(ϵ^ρ) distance, where ϵ denotes the magnitude of subgradient evaluation errors, and ρ encapsulates geometric characteristics of the underlying problem. Our analysis comprehensively addresses both vanishing and constant step-size regimes. Notably, the latter regime inherently enlarges the fluctuation region, yet this enlargement remains on the order of ϵ^ρ. In the convex scenario, employing a universal error bound applicable to coercive semialgebraic functions, we derive novel complexity results concerning averaged iterates. Additionally, our study produces auxiliary results of independent interest, including descent-type lemmas for nonsmooth nonconvex functions and an invariance principle governing the behavior of algorithmic sequences under small-step limits.
KW - Clarke subdifferential
KW - First-order methods
KW - Inexact subgradient
KW - Nonsmooth nonconvex optimization
KW - Path differentiable functions
KW - Semialgebraic functions
UR - https://www.scopus.com/pages/publications/105008674810
U2 - 10.1007/s10107-025-02245-w
DO - 10.1007/s10107-025-02245-w
M3 - Article
AN - SCOPUS:105008674810
SN - 0025-5610
JO - Mathematical Programming
JF - Mathematical Programming
ER -