Inexact subgradient methods for semialgebraic functions

Jérôme Bolte, Tam Le, Eric Moulines, Edouard Pauwels

Research output: Contribution to journal › Article › peer-review

Abstract

Motivated by the widespread use of approximate gradients in machine learning and optimization, we study inexact subgradient methods subject to persistent additive errors. In a nonconvex semialgebraic setting, assuming boundedness or coercivity, we show that the method produces iterates that eventually fluctuate near the critical set at a distance of order O(ϵ^ρ), where ϵ is the magnitude of the subgradient evaluation errors and ρ captures geometric characteristics of the underlying problem. Our analysis covers both vanishing and constant step-size regimes; the latter necessarily enlarges the fluctuation region, but the enlargement remains of order ϵ^ρ. In the convex case, using a universal error bound valid for coercive semialgebraic functions, we derive new complexity results for averaged iterates. Our study also yields auxiliary results of independent interest, including descent-type lemmas for nonsmooth nonconvex functions and an invariance principle governing the small-step limiting behavior of algorithmic sequences.
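The iteration analyzed in the abstract can be sketched numerically. The following is a minimal illustration (not the paper's implementation): the update x_{k+1} = x_k - α (g_k + e_k), where g_k is a subgradient at x_k and the additive error e_k has norm at most ϵ. The function names, the test function f(x) = ‖x‖₁ (a convex semialgebraic function whose only critical point is 0), and all parameter values are illustrative assumptions.

```python
import numpy as np


def inexact_subgradient_method(subgrad, x0, eps, step, n_iter=2000, seed=0):
    """Run the constant-step inexact subgradient iteration
    x_{k+1} = x_k - step * (g_k + e_k), with ||e_k|| <= eps.

    This is an illustrative sketch, not the authors' code: the error e_k
    is drawn as a random direction scaled to norm eps (a 'persistent'
    additive error that never vanishes along the iterations).
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        g = subgrad(x)                      # exact subgradient at x
        e = rng.normal(size=x.shape)        # random error direction
        e = eps * e / max(np.linalg.norm(e), 1e-12)  # rescale: ||e|| = eps
        x = x - step * (g + e)              # inexact subgradient step
    return x


# Illustrative example: f(x) = ||x||_1, with sign(x) a subgradient selection.
x_final = inexact_subgradient_method(
    subgrad=np.sign, x0=np.ones(5), eps=0.1, step=0.01
)
```

With a constant step size, the iterates do not converge to the critical point but end up fluctuating in a small neighborhood of it, consistent with the O(ϵ^ρ) fluctuation region described in the abstract.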

Original language: English
Journal: Mathematical Programming
DOIs
Publication status: Accepted/In press - 1 Jan 2025
Externally published: Yes

Keywords

  • Clarke subdifferential
  • First-order methods
  • Inexact subgradient
  • Nonsmooth nonconvex optimization
  • Path differentiable functions
  • Semialgebraic functions
