Riemannian Bilevel Optimization

22 May 2024
Sanchayan Dutta
Xiang Cheng
Suvrit Sra
arXiv:2405.15816
Abstract

We develop new algorithms for Riemannian bilevel optimization. We focus in particular on batch and stochastic gradient-based methods, with the explicit goal of avoiding second-order information such as Riemannian hyper-gradients. We propose and analyze $\mathrm{RF^2SA}$, a method that leverages first-order gradient information to navigate the complex geometry of Riemannian manifolds efficiently. Notably, $\mathrm{RF^2SA}$ is a single-loop algorithm, and thus easier to implement and use. Under various setups, including stochastic optimization, we provide explicit convergence rates for reaching $\epsilon$-stationary points. We also address the challenge of optimizing over Riemannian manifolds with constraints by adjusting the multiplier in the Lagrangian, ensuring convergence to the desired solution without requiring access to second-order derivatives.
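To make the abstract's description concrete, the sketch below shows the general pattern it alludes to: a single-loop, first-order scheme that folds the lower-level problem into a Lagrangian/penalty term, slowly adjusts the multiplier, and replaces Euclidean gradient steps with a tangent-space projection plus retraction on the manifold. This is not the paper's $\mathrm{RF^2SA}$ algorithm; it is a generic F2SA-style penalty update applied to a toy bilevel problem over the unit sphere. The functions $f$ and $g$, the matrix `A`, the step sizes, and the multiplier schedule are all assumptions made purely for illustration.

```python
import numpy as np

# Toy bilevel problem (all choices below are illustrative assumptions, not from the paper):
#   upper level:  f(x, y) = 0.5 * ||y - b||^2
#   lower level:  g(x, y) = 0.5 * ||y - A x||^2,  so the lower-level solution is y*(x) = A x
#   x is constrained to the unit sphere S^{d-1}, a simple embedded Riemannian manifold.

rng = np.random.default_rng(0)
d = 5
A = rng.standard_normal((d, d))
b = rng.standard_normal(d)

def grad_y_f(x, y):        # Euclidean gradient of f with respect to y
    return y - b

def grad_y_g(x, y):        # Euclidean gradient of g with respect to y
    return y - A @ x

def grad_x_g(x, y):        # Euclidean gradient of g with respect to x
    return -A.T @ (y - A @ x)

def project_tangent(x, e):
    # Riemannian gradient on the sphere: project the Euclidean gradient
    # onto the tangent space at x.
    return e - (x @ e) * x

def retract(x, v):
    # Retraction on the sphere: take a step in the ambient space, then renormalize.
    z = x + v
    return z / np.linalg.norm(z)

# Single-loop, first-order penalty scheme (a generic sketch in the spirit of
# F2SA-type methods; step sizes and multiplier schedule are ad hoc).
x = retract(rng.standard_normal(d), np.zeros(d))  # random point on the sphere
y = np.zeros(d)   # approximate minimizer of the penalized lower-level objective
z = np.zeros(d)   # tracks the unpenalized lower-level minimizer
lam, alpha, beta = 1.0, 0.1, 0.05

for k in range(2000):
    z = z - alpha * grad_y_g(x, z)                            # lower-level descent
    y = y - alpha * (grad_y_f(x, y) + lam * grad_y_g(x, y))   # penalized lower-level descent
    # Euclidean gradient in x of the Lagrangian f + lam * (g(x, y) - g(x, z));
    # for this toy f, grad_x f = 0, so only the penalty term contributes.
    e = lam * (grad_x_g(x, y) - grad_x_g(x, z))
    x = retract(x, -beta * project_tangent(x, e))             # Riemannian gradient step
    lam = min(lam * 1.001, 50.0)                              # slowly grow the multiplier

print("upper-level value 0.5*||A x - b||^2 :", 0.5 * np.linalg.norm(A @ x - b) ** 2)
```

The only manifold-specific ingredients are `project_tangent` and `retract`; swapping in another embedded manifold's projection and retraction leaves the single-loop structure unchanged, which is why first-order schemes of this kind avoid Riemannian hyper-gradients entirely.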
