ResearchTrend.AI

© 2025 ResearchTrend.AI, All rights reserved.

Fully First-Order Methods for Decentralized Bilevel Optimization

25 October 2024
Xiaoyu Wang
Xuxing Chen
Shiqian Ma
Tong Zhang
arXiv · PDF · HTML
Abstract

This paper focuses on decentralized stochastic bilevel optimization (DSBO), where agents communicate only with their neighbors. We propose Decentralized Stochastic Gradient Descent and Ascent with Gradient Tracking (DSGDA-GT), a novel algorithm that requires only first-order oracles, which are much cheaper than the second-order oracles widely adopted in existing works. We further provide a finite-time convergence analysis showing that for $n$ agents collaboratively solving the DSBO problem, the sample complexity of finding an $\epsilon$-stationary point with our algorithm is $\mathcal{O}(n^{-1}\epsilon^{-7})$, which matches the best-known result for the single-agent counterpart, with linear speedup. Numerical experiments demonstrate both the communication and training efficiency of our algorithm.
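To give intuition for the gradient-tracking mechanism the abstract refers to, here is a minimal sketch of decentralized gradient tracking on a single-level problem. This is not the paper's DSGDA-GT (which additionally handles the bilevel structure with a descent–ascent scheme); the quadratic local objectives $f_i(x) = \tfrac12 (x - b_i)^2$, the ring mixing matrix $W$, and the step size are illustrative assumptions. Each agent holds an iterate $x_i$ and a tracker $y_i$ that estimates the network-average gradient using only first-order (gradient) oracles.

```python
# Sketch of decentralized gradient tracking (illustrative; NOT the paper's
# DSGDA-GT). Local objectives f_i(x) = 0.5*(x - b_i)^2 are assumed for the
# demo, so the global minimizer of (1/n) * sum_i f_i is mean(b).
import numpy as np

n = 4                                  # number of agents
b = np.array([1.0, 2.0, 3.0, 4.0])     # per-agent data; optimum is b.mean()

def grad(i, x):
    # First-order oracle for agent i: gradient of 0.5*(x - b[i])**2.
    return x - b[i]

# Doubly stochastic mixing matrix for a ring: each agent averages itself
# and its two neighbors with weight 1/3 each.
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = W[i, (i - 1) % n] = W[i, (i + 1) % n] = 1.0 / 3.0

alpha = 0.1
x = np.zeros(n)                                      # agent iterates
y = np.array([grad(i, x[i]) for i in range(n)])      # gradient trackers

for _ in range(300):
    # Consensus step plus descent along the tracked average gradient,
    # then update the tracker with the change in local gradients.
    x_new = W @ x - alpha * y
    y = W @ y + np.array([grad(i, x_new[i]) - grad(i, x[i]) for i in range(n)])
    x = x_new

print(x)  # all agents should end up close to b.mean() == 2.5
```

The tracker preserves the invariant $\sum_i y_i^k = \sum_i \nabla f_i(x_i^k)$ because $W$ is doubly stochastic, which is what lets every agent descend along an estimate of the global gradient while only exchanging information with neighbors.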

View on arXiv
@article{wang2025_2410.19319,
  title={Fully First-Order Methods for Decentralized Bilevel Optimization},
  author={Xiaoyu Wang and Xuxing Chen and Shiqian Ma and Tong Zhang},
  journal={arXiv preprint arXiv:2410.19319},
  year={2025}
}