Abstract
We consider the problem of minimizing a non-convex function over a smooth manifold M. We propose a novel algorithm, the Orthogonal Directions Constrained Gradient Method (ODCGM), which only requires computing a projection onto a vector space. ODCGM is an infeasible method, but its iterates are constantly pulled toward the manifold, ensuring the convergence of ODCGM towards M. ODCGM is much simpler to implement than classical methods, which require the computation of a retraction. Moreover, we show that ODCGM exhibits the near-optimal oracle complexities O(1/ε²) and O(1/ε⁴) in the deterministic and stochastic cases, respectively. Furthermore, we establish that, under an appropriate choice of the projection metric, our method recovers the landing algorithm of Ablin and Peyré (2022), a recently introduced algorithm for optimization over the Stiefel manifold. As a result, we significantly extend the analysis of Ablin and Peyré (2022), establishing near-optimal rates in both deterministic and stochastic frameworks. Finally, we perform numerical experiments, which show the efficiency of ODCGM in a high-dimensional setting.
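To make the retraction-free idea concrete, here is a minimal NumPy sketch of a landing-style update on the Stiefel manifold, in the spirit of Ablin and Peyré (2022) as recovered by ODCGM. This is an illustrative assumption about the update's standard form (a descent direction along the manifold plus a penalty pulling iterates back toward feasibility), not the paper's exact algorithm; the function name, step size `eta`, and penalty weight `lam` are hypothetical choices.

```python
import numpy as np

def landing_step(X, grad, eta=0.1, lam=1.0):
    """One landing-style update toward the Stiefel manifold {X : X^T X = I}.

    Illustrative sketch (not the paper's exact method): combines a
    skew-symmetric descent direction with a penalty term that pulls the
    infeasible iterate back toward the manifold, so no retraction
    (QR/SVD re-orthogonalization) is ever computed.
    """
    # Skew-symmetric part of grad @ X^T yields a direction that moves
    # along (approximately tangent to) the manifold.
    A = grad @ X.T
    psi = 0.5 * (A - A.T) @ X
    # Penalty term: vanishes exactly when X^T X = I, otherwise pulls
    # the iterate toward feasibility.
    N = X @ (X.T @ X - np.eye(X.shape[1]))
    return X - eta * (psi + lam * N)
```

For instance, starting from an infeasible point `X = 1.5 * Q` with `Q` orthonormal and iterating `landing_step` drives the feasibility error `||X^T X - I||` to zero, without ever projecting back onto the manifold.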
| Original language | English |
|---|---|
| Pages (from-to) | 1228-1258 |
| Number of pages | 31 |
| Journal | Proceedings of Machine Learning Research |
| Volume | 195 |
| Publication status | Published - 1 Jan 2023 |
| Event | 36th Annual Conference on Learning Theory, COLT 2023 - Bangalore, India (12 Jul 2023 – 15 Jul 2023) |
Keywords
- Riemannian optimization
- Stiefel manifold
- constrained optimization
- non-convex optimization
- stochastic optimization