Moniqua: Modulo Quantized Communication in Decentralized SGD

26 February 2020
Yucheng Lu
Christopher De Sa

Papers citing "Moniqua: Modulo Quantized Communication in Decentralized SGD"

8 papers shown:

Beyond Exponential Graph: Communication-Efficient Topologies for Decentralized Learning via Finite-time Convergence
Yuki Takezawa
Ryoma Sato
Han Bao
Kenta Niwa
M. Yamada
19 May 2023

Lower Bounds and Accelerated Algorithms in Distributed Stochastic Optimization with Communication Compression
Yutong He
Xinmeng Huang
Yiming Chen
W. Yin
Kun Yuan
12 May 2023

GraB: Finding Provably Better Data Permutations than Random Reshuffling
Yucheng Lu
Wentao Guo
Christopher De Sa
22 May 2022

Maximizing Communication Efficiency for Large-scale Training via 0/1 Adam
Yucheng Lu
Conglong Li
Minjia Zhang
Christopher De Sa
Yuxiong He
12 Feb 2022

Decentralized Composite Optimization with Compression
Yao Li
Xiaorui Liu
Jiliang Tang
Ming Yan
Kun Yuan
10 Aug 2021

PowerGossip: Practical Low-Rank Communication Compression in Decentralized Deep Learning
Thijs Vogels
Sai Praneeth Karimireddy
Martin Jaggi
04 Aug 2020

Optimal Complexity in Decentralized Training
Yucheng Lu
Christopher De Sa
15 Jun 2020

MixML: A Unified Analysis of Weakly Consistent Parallel Learning
Yucheng Lu
J. Nash
Christopher De Sa
14 May 2020