A conditional gradient framework for composite convex minimization with applications to semidefinite programming

Alp Yurtsever, Olivier Fercoq, Francesco Locatello, Volkan Cevher

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

We propose a conditional gradient framework for a composite convex minimization template with broad applications. Our approach combines smoothing and homotopy techniques under the CGM framework, and provably achieves the optimal O(1/√k) convergence rate. We demonstrate that the same rate holds if the linear subproblems are solved approximately with additive or multiplicative error. In contrast with related work, we are able to characterize the convergence when the non-smooth term is an indicator function. Specific applications of our framework include non-smooth minimization, semidefinite programming, and minimization with linear inclusion constraints over a compact domain. Numerical evidence demonstrates the benefits of our framework.
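To illustrate the kind of iteration the abstract refers to, the following is a minimal sketch of the classic conditional gradient (Frank-Wolfe) step on a simple problem — not the authors' smoothed/homotopy variant. It minimizes a smooth convex quadratic over the probability simplex, where the linear subproblem (the linear minimization oracle) has a closed-form solution at a vertex; the problem data and step-size rule here are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch: classic conditional gradient method (CGM), not the
# paper's smoothed/homotopy framework. Problem data is synthetic.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

def grad(x):
    # Gradient of f(x) = 0.5 * ||A x - b||^2
    return A.T @ (A @ x - b)

x = np.ones(5) / 5  # start at the simplex barycenter
for k in range(200):
    g = grad(x)
    # Linear minimization oracle over the simplex: the minimizer of
    # <g, s> over the simplex is the vertex at the smallest gradient entry.
    s = np.zeros(5)
    s[np.argmin(g)] = 1.0
    gamma = 2.0 / (k + 2)  # standard CGM step size
    x = (1 - gamma) * x + gamma * s  # convex combination keeps x feasible
```

Each iterate remains in the simplex because it is a convex combination of feasible points; the paper's framework extends this template to composite objectives, including indicator-function terms, with inexact oracles.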

Original language: English
Title of host publication: 35th International Conference on Machine Learning, ICML 2018
Editors: Andreas Krause, Jennifer Dy
Publisher: International Machine Learning Society (IMLS)
Pages: 9096-9111
Number of pages: 16
ISBN (Electronic): 9781510867963
Publication status: Published - 1 Jan 2018
Externally published: Yes
Event: 35th International Conference on Machine Learning, ICML 2018 - Stockholm, Sweden
Duration: 10 Jul 2018 - 15 Jul 2018

Publication series

Name: 35th International Conference on Machine Learning, ICML 2018
Volume: 13

Conference

Conference: 35th International Conference on Machine Learning, ICML 2018
Country/Territory: Sweden
City: Stockholm
Period: 10/07/18 - 15/07/18
