arXiv: 2102.12528
Preserved central model for faster bidirectional compression in distributed settings
24 February 2021
Constantin Philippenko, Aymeric Dieuleveut
Papers citing "Preserved central model for faster bidirectional compression in distributed settings" (15 papers):

Tight analyses of first-order methods with error feedback
Daniel Berg Thomsen, Adrien B. Taylor, Aymeric Dieuleveut
05 Jun 2025

LoCoDL: Communication-Efficient Distributed Learning with Local Training and Compression
Laurent Condat, Artavazd Maranjyan, Peter Richtárik
07 Mar 2024

Communication Compression for Byzantine Robust Learning: New Efficient Algorithms and Improved Rates
Ahmad Rammal, Kaja Gruntkowska, Nikita Fedin, Eduard A. Gorbunov, Peter Richtárik
15 Oct 2023

Knowledge Distillation Performs Partial Variance Reduction
M. Safaryan, Alexandra Peste, Dan Alistarh
27 May 2023

Lower Bounds and Accelerated Algorithms in Distributed Stochastic Optimization with Communication Compression
Yutong He, Xinmeng Huang, Yiming Chen, W. Yin, Kun Yuan
12 May 2023

TAMUNA: Doubly Accelerated Distributed Optimization with Local Training, Compression, and Partial Participation
Laurent Condat, Ivan Agarský, Grigory Malinovsky, Peter Richtárik
20 Feb 2023 · FedML

DoCoFL: Downlink Compression for Cross-Device Federated Learning
Ron Dorfman, S. Vargaftik, Y. Ben-Itzhak, Kfir Y. Levy
01 Feb 2023 · FedML

Provably Doubly Accelerated Federated Learning: The First Theoretically Successful Combination of Local Training and Communication Compression
Laurent Condat, Ivan Agarský, Peter Richtárik
24 Oct 2022 · FedML

EF21-P and Friends: Improved Theoretical Communication Complexity for Distributed Optimization with Bidirectional Compression
Kaja Gruntkowska, Alexander Tyurin, Peter Richtárik
30 Sep 2022

Downlink Compression Improves TopK Sparsification
William Zou, H. Sterck, Jun Liu
30 Sep 2022

Lower Bounds and Nearly Optimal Algorithms in Distributed Learning with Communication Compression
Xinmeng Huang, Yiming Chen, W. Yin, Kun Yuan
08 Jun 2022

FedShuffle: Recipes for Better Use of Local Work in Federated Learning
Samuel Horváth, Maziar Sanjabi, Lin Xiao, Peter Richtárik, Michael G. Rabbat
27 Apr 2022 · FedML

Federated Expectation Maximization with heterogeneity mitigation and variance reduction
Aymeric Dieuleveut, G. Fort, Eric Moulines, Geneviève Robin
03 Nov 2021 · FedML

Bidirectional compression in heterogeneous settings for distributed or federated learning with partial participation: tight convergence guarantees
Constantin Philippenko, Aymeric Dieuleveut
25 Jun 2020 · FedML

Distributed Learning with Compressed Gradient Differences
Konstantin Mishchenko, Eduard A. Gorbunov, Martin Takáč, Peter Richtárik
26 Jan 2019