Lower Bounds and Accelerated Algorithms in Distributed Stochastic Optimization with Communication Compression

12 May 2023
Yutong He, Xinmeng Huang, Yiming Chen, W. Yin, Kun Yuan

Papers citing "Lower Bounds and Accelerated Algorithms in Distributed Stochastic Optimization with Communication Compression"

9 papers shown

Accelerated Distributed Optimization with Compression and Error Feedback
Yuan Gao, Anton Rodomanov, Jeremy Rack, Sebastian U. Stich
11 Mar 2025

Single-Timescale Multi-Sequence Stochastic Approximation Without Fixed Point Smoothness: Theories and Applications
Yue Huang, Zhaoxian Wu, Shiqian Ma, Qing Ling
17 Oct 2024

EControl: Fast Distributed Optimization with Compression and Error Control
Yuan Gao, Rustem Islamov, Sebastian U. Stich
06 Nov 2023

Stochastic Controlled Averaging for Federated Learning with Communication Compression
Xinmeng Huang, Ping Li, Xiaoyun Li
16 Aug 2023

Momentum Benefits Non-IID Federated Learning Simply and Provably
Ziheng Cheng, Xinmeng Huang, Pengfei Wu, Kun Yuan
28 Jun 2023

Unbiased Compression Saves Communication in Distributed Optimization: When and How Much?
Yutong He, Xinmeng Huang, Kun Yuan
25 May 2023

DASHA: Distributed Nonconvex Optimization with Communication Compression, Optimal Oracle Complexity, and No Client Synchronization
A. Tyurin, Peter Richtárik
02 Feb 2022

EF21 with Bells & Whistles: Practical Algorithmic Extensions of Modern Error Feedback
Ilyas Fatkhullin, Igor Sokolov, Eduard A. Gorbunov, Zhize Li, Peter Richtárik
07 Oct 2021

DecentLaM: Decentralized Momentum SGD for Large-batch Deep Training
Kun Yuan, Yiming Chen, Xinmeng Huang, Yingya Zhang, Pan Pan, Yinghui Xu, W. Yin
24 Apr 2021