DQ-SGD: Dynamic Quantization in SGD for Communication-Efficient Distributed Learning

30 July 2021
Guangfeng Yan
Shao-Lun Huang
Tian-Shing Lan
Linqi Song

Papers citing "DQ-SGD: Dynamic Quantization in SGD for Communication-Efficient Distributed Learning"

2 / 2 papers shown
Layered Randomized Quantization for Communication-Efficient and Privacy-Preserving Distributed Learning
Guangfeng Yan
Tan Li
Tian-Shing Lan
Kui Wu
Linqi Song
12 Dec 2023
Rethinking gradient sparsification as total error minimization
Atal Narayan Sahu
Aritra Dutta
A. Abdelmoniem
Trambak Banerjee
Marco Canini
Panos Kalnis
02 Aug 2021