arXiv:2107.09461
CANITA: Faster Rates for Distributed Convex Optimization with Communication Compression
20 July 2021
Zhize Li, Peter Richtárik
Papers citing "CANITA: Faster Rates for Distributed Convex Optimization with Communication Compression" (8 of 8 papers shown)
Accelerated Distributed Optimization with Compression and Error Feedback
Yuan Gao, Anton Rodomanov, Jeremy Rack, Sebastian U. Stich · 11 Mar 2025
Federated Cubic Regularized Newton Learning with Sparsification-amplified Differential Privacy
Wei Huo, Changxin Liu, Kemi Ding, Karl H. Johansson, Ling Shi · 08 Aug 2024 · FedML
Communication Compression for Byzantine Robust Learning: New Efficient Algorithms and Improved Rates
Ahmad Rammal, Kaja Gruntkowska, Nikita Fedin, Eduard A. Gorbunov, Peter Richtárik · 15 Oct 2023
Convergence and Privacy of Decentralized Nonconvex Optimization with Gradient Clipping and Communication Compression
Boyue Li, Yuejie Chi · 17 May 2023
Coresets for Vertical Federated Learning: Regularized Linear Regression and K-Means Clustering
Lingxiao Huang, Zhize Li, Jialin Sun, Haoyu Zhao · 26 Oct 2022 · FedML
BEER: Fast O(1/T) Rate for Decentralized Nonconvex Optimization with Communication Compression
Haoyu Zhao, Boyue Li, Zhize Li, Peter Richtárik, Yuejie Chi · 31 Jan 2022
EF21 with Bells & Whistles: Practical Algorithmic Extensions of Modern Error Feedback
Ilyas Fatkhullin, Igor Sokolov, Eduard A. Gorbunov, Zhize Li, Peter Richtárik · 07 Oct 2021
ANITA: An Optimal Loopless Accelerated Variance-Reduced Gradient Method
Zhize Li · 21 Mar 2021