ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

Distributed Methods with Compressed Communication for Solving Variational Inequalities, with Theoretical Guarantees

7 October 2021
Aleksandr Beznosikov, Peter Richtárik, Michael Diskin, Max Ryabinin, Alexander Gasnikov
Topic: FedML
arXiv: 2110.03313 (abs / PDF / HTML)

Papers citing "Distributed Methods with Compressed Communication for Solving Variational Inequalities, with Theoretical Guarantees"

7 / 7 papers shown
  1. Near-Optimal Distributed Minimax Optimization under the Second-Order Similarity
     Qihao Zhou, Haishan Ye, Luo Luo
     25 May 2024

  2. Communication Compression for Byzantine Robust Learning: New Efficient Algorithms and Improved Rates
     Ahmad Rammal, Kaja Gruntkowska, Nikita Fedin, Eduard A. Gorbunov, Peter Richtárik
     15 Oct 2023

  3. Distributed Extra-gradient with Optimal Complexity and Communication Guarantees
     Ali Ramezani-Kebrya, Kimon Antonakopoulos, Igor Krawczuk, Justin Deschenaux, Volkan Cevher
     17 Aug 2023

  4. Similarity, Compression and Local Steps: Three Pillars of Efficient Communications for Distributed Variational Inequalities
     Aleksandr Beznosikov, Martin Takáč, Alexander Gasnikov
     15 Feb 2023

  5. Compression and Data Similarity: Combination of Two Techniques for Communication-Efficient Solving of Distributed Variational Inequalities
     Aleksandr Beznosikov, Alexander Gasnikov
     19 Jun 2022

  6. Federated Minimax Optimization: Improved Convergence Analyses and Algorithms (FedML)
     Pranay Sharma, Rohan Panda, Gauri Joshi, P. Varshney
     9 Mar 2022

  7. Stochastic Gradient Descent-Ascent: Unified Theory and New Efficient Methods
     Aleksandr Beznosikov, Eduard A. Gorbunov, Hugo Berard, Nicolas Loizou
     15 Feb 2022