Almost surely constrained convex optimization

Olivier Fercoq, Ahmet Alacaoglu, Ion Necoara, Volkan Cevher

Research output: Contribution to journal › Conference article › peer-review

Abstract

We propose a stochastic gradient framework for solving stochastic composite convex optimization problems with a (possibly) infinite number of linear inclusion constraints that need to be satisfied almost surely. We use smoothing and homotopy techniques to handle constraints without the need for matrix-valued projections. We show that our stochastic gradient algorithm achieves an O(log(k)/√k) convergence rate for general convex objectives and an O(log(k)/k) convergence rate for restricted strongly convex objectives. These rates are known to be optimal up to a logarithmic factor, even without constraints. We conduct numerical experiments on basis pursuit, hard-margin support vector machine, and portfolio optimization problems and show that our algorithm achieves state-of-the-art practical performance.
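For intuition only, the sketch below illustrates the general smoothing-and-homotopy idea the abstract describes: sample one constraint per iteration, penalize its violation through a smoothed quadratic term, and shrink the smoothing parameter over time so the constraints are enforced in the limit without any projection. It assumes equality constraints a_i^T x = b_i and a simple quadratic objective, and the step-size and smoothing schedules are illustrative choices, not the paper's algorithm; consult the paper for the actual method and its guarantees.

```python
import numpy as np

# Illustrative sketch (not the paper's exact algorithm): stochastic gradient
# on a smoothed constraint penalty, with a homotopy on the smoothing
# parameter. One linear constraint a_i^T x = b_i is sampled per iteration,
# standing in for constraints that must hold almost surely.

rng = np.random.default_rng(0)
d, m = 20, 100
A = rng.standard_normal((m, d))
b = A @ rng.standard_normal(d)          # consistent system: a feasible point exists
c = rng.standard_normal(d)              # objective target, f(x) = 0.5 * ||x - c||^2

max_row_norm_sq = np.max(np.einsum("ij,ij->i", A, A))

x = np.zeros(d)
for k in range(1, 50001):
    i = rng.integers(m)                  # sample one constraint index
    beta = 1.0 / np.sqrt(k)              # homotopy: smoothing parameter -> 0
    step = beta / (2.0 * max_row_norm_sq)  # keeps the penalty step stable
    # stochastic gradient of f(x) + (1 / (2 * beta)) * (a_i^T x - b_i)^2
    g = (x - c) + (1.0 / beta) * (A[i] @ x - b[i]) * A[i]
    x -= step * g

print("max constraint violation:", np.linalg.norm(A @ x - b, np.inf))
print("objective value:", 0.5 * np.linalg.norm(x - c) ** 2)
```

Note that the penalty-gradient step reduces to a relaxed Kaczmarz-style update, so no projection onto the intersection of all constraint sets is ever computed; as beta shrinks, the constraint violation is driven to zero.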

Original language: English
Pages (from-to): 1910-1919
Number of pages: 10
Journal: Proceedings of Machine Learning Research
Volume: 97
Publication status: Published - 1 Jan 2019
Externally published: Yes
Event: 36th International Conference on Machine Learning, ICML 2019 - Long Beach, United States
Duration: 9 Jun 2019 to 15 Jun 2019
