Convergence rates and approximation results for SGD and its continuous-time counterpart

Research output: Contribution to journal › Conference article › peer-review

Abstract

This paper presents a thorough theoretical analysis of Stochastic Gradient Descent (SGD) with non-increasing step sizes. First, we show that the recursion defining SGD can be provably approximated by solutions of a time-inhomogeneous Stochastic Differential Equation (SDE) using an appropriate coupling. In the specific case of batch noise, we refine our results using recent advances in Stein's method. Then, motivated by recent analyses of deterministic and stochastic optimization methods through their continuous counterparts, we study the long-time behavior of the continuous processes at hand and establish non-asymptotic bounds. To this end, we develop new comparison techniques which are of independent interest. Adapting these techniques to the discrete setting, we show that the same results hold for the corresponding SGD sequences. In our analysis, we notably improve non-asymptotic bounds in the convex setting for SGD under weaker assumptions than those considered in previous works. Finally, we also establish finite-time convergence results under various conditions, including relaxations of the famous Łojasiewicz inequality, which apply to a class of non-convex functions.
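For orientation, here is a minimal sketch of the two objects the abstract compares, written in standard notation rather than the paper's own (the symbols $f$, $\gamma_n$, $\varepsilon_n$, $\Sigma$, and $B_t$ below are illustrative assumptions, not taken from the paper). The SGD recursion with non-increasing step sizes $(\gamma_n)$ reads

$$x_{n+1} = x_n - \gamma_{n+1}\bigl(\nabla f(x_n) + \varepsilon_{n+1}\bigr),$$

where $\varepsilon_{n+1}$ models the gradient noise, and a time-inhomogeneous SDE counterpart of the kind the abstract alludes to is

$$\mathrm{d}X_t = -\nabla f(X_t)\,\mathrm{d}t + \sqrt{\gamma(t)}\,\Sigma(X_t)^{1/2}\,\mathrm{d}B_t,$$

with $\gamma(t)$ a continuous-time interpolation of the step sizes, $\Sigma$ a noise covariance field, and $(B_t)$ a standard Brownian motion.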

Original language: English
Pages (from-to): 1965-2058
Number of pages: 94
Journal: Proceedings of Machine Learning Research
Volume: 134
Publication status: Published - 1 Jan 2021
Externally published: Yes
Event: 34th Conference on Learning Theory, COLT 2021 - Boulder, United States
Duration: 15 Aug 2021 – 19 Aug 2021

Keywords

  • Stochastic Differential Equations
  • Stochastic Gradient Descent
  • approximation results
  • convergence rates
