CEDAS: A Compressed Decentralized Stochastic Gradient Method with Improved Convergence

14 January 2023
Kun-Yen Huang
Shin-Yi Pu

Papers citing "CEDAS: A Compressed Decentralized Stochastic Gradient Method with Improved Convergence"

An Accelerated Distributed Stochastic Gradient Method with Momentum
Kun-Yen Huang, Shi Pu, Angelia Nedić
15 Feb 2024

Distributed Random Reshuffling Methods with Improved Convergence
Kun-Yen Huang, Linli Zhou, Shi Pu
21 Jun 2023

Convergence and Privacy of Decentralized Nonconvex Optimization with Gradient Clipping and Communication Compression
Boyue Li, Yuejie Chi
17 May 2023

DoCoM: Compressed Decentralized Optimization with Near-Optimal Sample Complexity
Chung-Yiu Yau, Hoi-To Wai
01 Feb 2022

IntSGD: Adaptive Floatless Compression of Stochastic Gradients
Konstantin Mishchenko, Bokun Wang, D. Kovalev, Peter Richtárik
16 Feb 2021