ResearchTrend.AI
arXiv:2303.09261
Orthogonal Directions Constrained Gradient Method: from non-linear equality constraints to Stiefel manifold

16 March 2023
S. Schechtman
D. Tiapkin
Michael Muehlebach
Eric Moulines
Abstract

We consider the problem of minimizing a non-convex function over a smooth manifold $\mathcal{M}$. We propose a novel algorithm, the Orthogonal Directions Constrained Gradient Method (ODCGM), which only requires computing a projection onto a vector space. ODCGM is infeasible, but its iterates are constantly pulled towards the manifold, ensuring the convergence of ODCGM towards $\mathcal{M}$. ODCGM is much simpler to implement than the classical methods, which require the computation of a retraction. Moreover, we show that ODCGM exhibits the near-optimal oracle complexities $\mathcal{O}(1/\varepsilon^2)$ and $\mathcal{O}(1/\varepsilon^4)$ in the deterministic and stochastic cases, respectively. Furthermore, we establish that, under an appropriate choice of the projection metric, our method recovers the landing algorithm of Ablin and Peyré (2022), a recently introduced algorithm for optimization over the Stiefel manifold. As a result, we significantly extend the analysis of Ablin and Peyré (2022), establishing near-optimal rates in both the deterministic and stochastic frameworks. Finally, we perform numerical experiments which show the efficiency of ODCGM in a high-dimensional setting.
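To illustrate the retraction-free idea on the Stiefel case, the sketch below implements the landing update of Ablin and Peyré (2022), which the abstract says ODCGM recovers under an appropriate projection metric: a skew-symmetric "relative gradient" term moves the iterate along the manifold, while a penalty term pulls it back towards orthonormality, so no retraction is needed. The toy objective $f(X) = -\tfrac{1}{2}\,\mathrm{tr}(X^\top A X)$, the step size, and the penalty weight are our own illustrative choices, not taken from the paper.

```python
import numpy as np

def landing_step(X, grad, eta=0.05, lam=1.0):
    """One landing update X <- X - eta * Lambda(X), where
    Lambda(X) = skew(grad @ X^T) @ X + lam * X @ (X^T X - I).

    The first term is a tangential "relative gradient" move; the second
    is the gradient of ||X^T X - I||_F^2 / 4, pulling the infeasible
    iterate back towards the Stiefel manifold (no retraction needed).
    """
    p = X.shape[1]
    GXt = grad @ X.T
    skew = 0.5 * (GXt - GXt.T)            # skew-symmetric part of grad X^T
    penalty = X @ (X.T @ X - np.eye(p))   # feasibility-restoring term
    return X - eta * (skew @ X + lam * penalty)

# Toy problem (our assumption, not from the paper): minimize
# f(X) = -0.5 * tr(X^T A X) for a symmetric A, whose minimizers over the
# Stiefel manifold span the top-p eigenvectors of A.
rng = np.random.default_rng(0)
n, p = 20, 3
M = rng.standard_normal((n, n))
A = M + M.T
A /= np.linalg.norm(A, 2)                 # normalize for a safe step size

X = 0.1 * rng.standard_normal((n, p))     # infeasible start, off the manifold
for _ in range(500):
    grad = -A @ X                         # Euclidean gradient of f
    X = landing_step(X, grad)

feasibility = np.linalg.norm(X.T @ X - np.eye(p))
print(f"||X^T X - I||_F = {feasibility:.2e}")
```

Even from an infeasible starting point, the penalty term drives $\|X^\top X - I\|$ to a small residual, matching the abstract's description of iterates that are "constantly pulled towards the manifold."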

View on arXiv