Provably Doubly Accelerated Federated Learning: The First Theoretically Successful Combination of Local Training and Communication Compression

24 October 2022
Laurent Condat, Ivan Agarský, Peter Richtárik
    FedML

Papers citing "Provably Doubly Accelerated Federated Learning: The First Theoretically Successful Combination of Local Training and Communication Compression"

7 / 7 papers shown

LoCoDL: Communication-Efficient Distributed Learning with Local Training and Compression
Laurent Condat, A. Maranjyan, Peter Richtárik
07 Mar 2024

Similarity, Compression and Local Steps: Three Pillars of Efficient Communications for Distributed Variational Inequalities
Aleksandr Beznosikov, Martin Takáč, Alexander Gasnikov
15 Feb 2023

EF21 with Bells & Whistles: Practical Algorithmic Extensions of Modern Error Feedback
Ilyas Fatkhullin, Igor Sokolov, Eduard A. Gorbunov, Zhize Li, Peter Richtárik
07 Oct 2021

A Field Guide to Federated Optimization
Jianyu Wang, Zachary B. Charles, Zheng Xu, Gauri Joshi, H. B. McMahan, ..., Mi Zhang, Tong Zhang, Chunxiang Zheng, Chen Zhu, Wennan Zhu
FedML
14 Jul 2021

Linear Convergence in Federated Learning: Tackling Client Heterogeneity and Sparse Gradients
A. Mitra, Rayana H. Jaafar, George J. Pappas, Hamed Hassani
FedML
14 Feb 2021

Linearly Converging Error Compensated SGD
Eduard A. Gorbunov, D. Kovalev, Dmitry Makarenko, Peter Richtárik
23 Oct 2020

FedPAQ: A Communication-Efficient Federated Learning Method with Periodic Averaging and Quantization
Amirhossein Reisizadeh, Aryan Mokhtari, Hamed Hassani, Ali Jadbabaie, Ramtin Pedarsani
FedML
28 Sep 2019