ON THE CONVERGENCE OF STOCHASTIC PRIMAL-DUAL HYBRID GRADIENT

Research output: Contribution to journal › Article › peer-review

Abstract

In this paper, we analyze the recently proposed stochastic primal-dual hybrid gradient (SPDHG) algorithm and provide new theoretical results. In particular, we prove almost sure convergence of the iterates to a solution under convexity, and linear convergence under further structure, using standard step sizes independent of strong convexity or other regularity constants. In the general convex case, we also prove an O(1/k) convergence rate for the ergodic sequence with respect to the expected primal-dual gap function. Our assumption for linear convergence is metric subregularity, which is satisfied for strongly convex-strongly concave problems as well as for many nonsmooth and/or non-strongly convex problems, such as linear programs, Lasso, and support vector machines. We also provide numerical evidence that SPDHG with standard step sizes achieves competitive practical performance against its specialized strongly convex variant SPDHG-μ and other state-of-the-art algorithms, including variance reduction methods.
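
To make the abstract concrete, the following is a minimal sketch of an SPDHG-style iteration for the Lasso problem min_x 0.5||Ax - b||^2 + lam*||x||_1, one of the example problems mentioned above. It is an illustration under stated assumptions, not the paper's reference implementation: the function name spdhg, the uniform block sampling, the theta = 1 extrapolation, and the step sizes tau and sigmas below are all hypothetical choices, and the exact step-size conditions and convergence guarantees are the ones given in the paper.

    import numpy as np

    def spdhg(A_blocks, b_blocks, lam, tau, sigmas, n_iter=1000, seed=0):
        # Hypothetical SPDHG-style sketch for min_x 0.5||Ax - b||^2 + lam*||x||_1,
        # with the dual variable split across the row blocks of A.
        rng = np.random.default_rng(seed)
        n = len(A_blocks)
        d = A_blocks[0].shape[1]
        x = np.zeros(d)
        y = [np.zeros(Ai.shape[0]) for Ai in A_blocks]
        ATy = np.zeros(d)     # running value of A^T y
        ATybar = np.zeros(d)  # running value of A^T ybar (extrapolated dual)
        for _ in range(n_iter):
            # primal step: prox of tau*lam*||.||_1 is soft-thresholding
            v = x - tau * ATybar
            x = np.sign(v) * np.maximum(np.abs(v) - tau * lam, 0.0)
            # pick one dual block uniformly at random and update only that block;
            # prox of sigma*f_i^* for f_i(z) = 0.5||z - b_i||^2 is given in closed form
            i = rng.integers(n)
            t = y[i] + sigmas[i] * (A_blocks[i] @ x)
            y_new = (t - sigmas[i] * b_blocks[i]) / (1.0 + sigmas[i])
            dy = A_blocks[i].T @ (y_new - y[i])
            y[i] = y_new
            ATy += dy
            ATybar = ATy + n * dy  # theta = 1, uniform sampling: theta / p_i = n
        return x

A possible usage, with illustrative (untuned) step sizes based on the spectral norms of the blocks:

    rng = np.random.default_rng(1)
    A = rng.standard_normal((40, 10))
    b = rng.standard_normal(40)
    idx = np.array_split(np.arange(40), 4)
    A_blocks = [A[ix] for ix in idx]
    b_blocks = [b[ix] for ix in idx]
    norms = [np.linalg.norm(Ai, 2) for Ai in A_blocks]
    tau = 0.9 / (len(A_blocks) * max(norms))
    sigmas = [0.9 / nrm for nrm in norms]
    x_hat = spdhg(A_blocks, b_blocks, lam=0.1, tau=tau, sigmas=sigmas)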

Original language: English
Pages (from-to): 1288-1318
Number of pages: 31
Journal: SIAM Journal on Optimization
Volume: 32
Issue number: 2
DOIs
Publication status: Published - 1 Jan 2022

Keywords

  • convex optimization
  • coordinate descent
  • primal-dual algorithms
