CEDAS: A Compressed Decentralized Stochastic Gradient Method with Improved Convergence
arXiv: 2301.05872
14 January 2023
Kun-Yen Huang, Shi Pu
Papers citing "CEDAS: A Compressed Decentralized Stochastic Gradient Method with Improved Convergence" (5 of 5 papers shown)
An Accelerated Distributed Stochastic Gradient Method with Momentum
Kun-Yen Huang, Shi Pu, Angelia Nedić
15 Feb 2024

Distributed Random Reshuffling Methods with Improved Convergence
Kun-Yen Huang, Linli Zhou, Shi Pu
21 Jun 2023

Convergence and Privacy of Decentralized Nonconvex Optimization with Gradient Clipping and Communication Compression
Boyue Li, Yuejie Chi
17 May 2023

DoCoM: Compressed Decentralized Optimization with Near-Optimal Sample Complexity
Chung-Yiu Yau, Hoi-To Wai
01 Feb 2022

IntSGD: Adaptive Floatless Compression of Stochastic Gradients
Konstantin Mishchenko, Bokun Wang, D. Kovalev, Peter Richtárik
16 Feb 2021