arXiv:2210.13277
Provably Doubly Accelerated Federated Learning: The First Theoretically Successful Combination of Local Training and Communication Compression
Laurent Condat, Ivan Agarský, Peter Richtárik
24 October 2022
Papers citing "Provably Doubly Accelerated Federated Learning: The First Theoretically Successful Combination of Local Training and Communication Compression" (7 of 7 papers shown)
1. LoCoDL: Communication-Efficient Distributed Learning with Local Training and Compression
   Laurent Condat, A. Maranjyan, Peter Richtárik — 07 Mar 2024

2. Similarity, Compression and Local Steps: Three Pillars of Efficient Communications for Distributed Variational Inequalities
   Aleksandr Beznosikov, Martin Takáč, Alexander Gasnikov — 15 Feb 2023

3. EF21 with Bells & Whistles: Practical Algorithmic Extensions of Modern Error Feedback
   Ilyas Fatkhullin, Igor Sokolov, Eduard A. Gorbunov, Zhize Li, Peter Richtárik — 07 Oct 2021

4. A Field Guide to Federated Optimization
   Jianyu Wang, Zachary B. Charles, Zheng Xu, Gauri Joshi, H. B. McMahan, ..., Mi Zhang, Tong Zhang, Chunxiang Zheng, Chen Zhu, Wennan Zhu — 14 Jul 2021

5. Linear Convergence in Federated Learning: Tackling Client Heterogeneity and Sparse Gradients
   A. Mitra, Rayana H. Jaafar, George J. Pappas, Hamed Hassani — 14 Feb 2021

6. Linearly Converging Error Compensated SGD
   Eduard A. Gorbunov, D. Kovalev, Dmitry Makarenko, Peter Richtárik — 23 Oct 2020

7. FedPAQ: A Communication-Efficient Federated Learning Method with Periodic Averaging and Quantization
   Amirhossein Reisizadeh, Aryan Mokhtari, Hamed Hassani, Ali Jadbabaie, Ramtin Pedarsani — 28 Sep 2019