Stochastic Subgradient Descent Escapes Active Strict Saddles on Weakly Convex Functions

Research output: Contribution to journal › Article › peer-review

Abstract

In nonsmooth stochastic optimization, we establish the nonconvergence of the stochastic subgradient descent (SGD) method to the critical points recently termed active strict saddles by Davis and Drusvyatskiy. Such points lie on a manifold M along which the function f has a direction of second-order negative curvature, while off this manifold the norm of the Clarke subdifferential of f is bounded away from zero. We require two conditions on f. The first is a Verdier stratification condition, a refinement of the popular Whitney stratification; it allows us to establish a strengthened version of the projection formula of Bolte et al. for Whitney stratifiable functions, which is of independent interest. The second, termed the angle condition, allows us to control the distance of the iterates to M. When f is weakly convex, our assumptions are generic. Consequently, generically in the class of definable weakly convex functions, SGD converges to a local minimizer.
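For context, the SGD iteration referred to in the abstract can be written schematically as follows; the exact step-size and noise assumptions are those of the paper and are not reproduced here, so this display is only an illustrative sketch.

\[
x_{n+1} \;=\; x_n \;-\; \gamma_{n+1}\,\bigl(v_n + \xi_{n+1}\bigr), \qquad v_n \in \partial f(x_n),
\]

where \(\partial f\) denotes the Clarke subdifferential of \(f\), \((\gamma_n)\) is a vanishing step-size sequence, and \((\xi_n)\) models the stochastic noise in the subgradient evaluations.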

Original language: English
Pages (from-to): 1761-1790
Number of pages: 30
Journal: Mathematics of Operations Research
Volume: 49
Issue number: 3
DOIs
Publication status: Published - 1 Aug 2024

Keywords

  • Clarke subdifferential
  • avoidance of traps
  • nonsmooth optimization
  • stochastic gradient descent
  • stratification
  • weakly convex
