Communication-Efficient Gradient Descent-Ascent Methods for Distributed Variational Inequalities: Unified Analysis and Local Updates

8 June 2023 · arXiv:2306.05100
Siqi Zhang, S. Choudhury, Sebastian U. Stich, Nicolas Loizou
FedML

Papers citing "Communication-Efficient Gradient Descent-Ascent Methods for Distributed Variational Inequalities: Unified Analysis and Local Updates"

3 / 3 papers shown

Achieving Near-Optimal Convergence for Distributed Minimax Optimization with Adaptive Stepsizes
Yan Huang, Xiang Li, Yipeng Shen, Niao He, Jinming Xu
05 Jun 2024

A Field Guide to Federated Optimization
Jianyu Wang, Zachary B. Charles, Zheng Xu, Gauri Joshi, H. B. McMahan, ..., Mi Zhang, Tong Zhang, Chunxiang Zheng, Chen Zhu, Wennan Zhu
FedML
14 Jul 2021

Stochastic Variance Reduction for Variational Inequality Methods
Ahmet Alacaoglu, Yura Malitsky
16 Feb 2021