Short Paper - Quadratic minimization: from conjugate gradient to an adaptive Polyak’s momentum method with Polyak step-sizes

Research output: Contribution to journal › Article › peer-review

Abstract

In this work, we propose an adaptive variation on the classical Heavy-ball method for convex quadratic minimization. The adaptivity crucially relies on so-called "Polyak step-sizes", which consist in using knowledge of the optimal value of the optimization problem at hand instead of problem parameters such as a few eigenvalues of the Hessian. This method also turns out to be equivalent to a variation of the classical conjugate gradient method, and thereby inherits many of its attractive features, including finite-time convergence, instance optimality, and worst-case convergence rates. The classical gradient method with Polyak step-sizes is known to behave very well in situations in which it can be used, and the question of whether momentum can be incorporated into this method, and whether doing so improves it, appeared to be open. We provide a definitive answer to this question for minimizing convex quadratic functions, an arguably necessary first step for developing such methods in more general setups.
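To fix ideas, the "Polyak step-size" mentioned above can be illustrated on the plain gradient method (without the momentum variation that is the paper's contribution). The sketch below, with a hypothetical function name, applies the classical step-size γ_k = (f(x_k) − f*)/‖∇f(x_k)‖² to a convex quadratic f(x) = ½xᵀAx − bᵀx, assuming the optimal value f* is known; it is an illustration of the step-size concept only, not the adaptive Heavy-ball method of the paper.

```python
import numpy as np

def polyak_gradient_descent(A, b, x0, f_star, iters=200):
    """Gradient descent with the classical Polyak step-size on
    f(x) = 0.5 x^T A x - b^T x, where f_star = min f is known."""
    f = lambda x: 0.5 * x @ A @ x - b @ x
    x = x0.astype(float)
    for _ in range(iters):
        g = A @ x - b                      # gradient of the quadratic
        gap = f(x) - f_star                # current optimality gap
        if gap <= 0 or np.linalg.norm(g) < 1e-12:
            break                          # converged to machine precision
        # Polyak step-size: uses f_star instead of eigenvalue bounds on A
        x = x - (gap / (g @ g)) * g
    return x

# Example: a small positive-definite quadratic
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = np.linalg.solve(A, b)             # exact minimizer
f_star = 0.5 * x_star @ A @ x_star - b @ x_star
x = polyak_gradient_descent(A, b, np.zeros(2), f_star)
```

Note that the step-size requires no estimate of the eigenvalues of A; the only extra information used is the optimal value f*, which is exactly the trade-off the abstract describes.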

Original language: English
Article number: 9
Journal: Open Journal of Mathematical Optimization
Volume: 5
DOIs
Publication status: Published - 1 Jan 2024

Keywords

  • Conjugate Gradient
  • Heavy-ball
  • Optimality
  • Optimization
  • Polyak step-sizes
  • Quadratic
