Uncertainty Principle for Communication Compression in Distributed and Federated Learning and the Search for an Optimal Compressor
M. Safaryan, Egor Shulgin, Peter Richtárik
20 February 2020 · arXiv: 2002.08958
Papers citing "Uncertainty Principle for Communication Compression in Distributed and Federated Learning and the Search for an Optimal Compressor" (10 papers)
Correlated Quantization for Faster Nonconvex Distributed Optimization
Andrei Panferov, Yury Demidovich, Ahmad Rammal, Peter Richtárik
10 Jan 2024
Matrix Compression via Randomized Low Rank and Low Precision Factorization
R. Saha, Varun Srivastava, Mert Pilanci
17 Oct 2023
Lower Bounds and Accelerated Algorithms in Distributed Stochastic Optimization with Communication Compression
Yutong He, Xinmeng Huang, Yiming Chen, W. Yin, Kun Yuan
12 May 2023
Breaking the Communication-Privacy-Accuracy Tradeoff with f-Differential Privacy
Richeng Jin, Z. Su, C. Zhong, Zhaoyang Zhang, Tony Q. S. Quek, H. Dai
19 Feb 2023
CEDAS: A Compressed Decentralized Stochastic Gradient Method with Improved Convergence
Kun-Yen Huang, Shin-Yi Pu
14 Jan 2023
Minimax Optimal Quantization of Linear Models: Information-Theoretic Limits and Efficient Algorithms
R. Saha, Mert Pilanci, Andrea J. Goldsmith
23 Feb 2022
Wyner-Ziv Gradient Compression for Federated Learning
Kai Liang, Huiru Zhong, Haoning Chen, Youlong Wu
16 Nov 2021
Rethinking gradient sparsification as total error minimization
Atal Narayan Sahu, Aritra Dutta, A. Abdelmoniem, Trambak Banerjee, Marco Canini, Panos Kalnis
02 Aug 2021
Efficient Randomized Subspace Embeddings for Distributed Optimization under a Communication Budget
R. Saha, Mert Pilanci, Andrea J. Goldsmith
13 Mar 2021
MARINA: Faster Non-Convex Distributed Learning with Compression
Eduard A. Gorbunov, Konstantin Burlachenko, Zhize Li, Peter Richtárik
15 Feb 2021