ResearchTrend.AI

Compressed Distributed Gradient Descent: Communication-Efficient Consensus over Networks

arXiv: 1812.04048 (v5, latest)
10 December 2018
Xin Zhang, Jia Liu, Zhengyuan Zhu, Elizabeth S. Bentley

Papers citing "Compressed Distributed Gradient Descent: Communication-Efficient Consensus over Networks"

13 / 13 papers shown

1. Multi-Tier Computing-Enabled Digital Twin in 6G Networks
   Kunlun Wang, Yongyi Tang, T. Duong, Saeed R. Khosravirad, O. Dobre, G. Karagiannidis
   28 Dec 2023 · AI4CE

2. Serverless Federated AUPRC Optimization for Multi-Party Collaborative Imbalanced Data Mining
   Xidong Wu, Zhengmian Hu, Jian Pei, Heng Huang
   06 Aug 2023

3. DIAMOND: Taming Sample and Communication Complexities in Decentralized Bilevel Optimization
   Pei-Yuan Qiu, Yining Li, Zhuqing Liu, Prashant Khanduri, Jia Liu, Ness B. Shroff, Elizabeth S. Bentley, K. Turck
   05 Dec 2022

4. Quantization for decentralized learning under subspace constraints
   Roula Nassif, Stefan Vlaski, Marco Carpentiero, Vincenzo Matta, Marc Antonini, Ali H. Sayed
   16 Sep 2022

5. Finite-Bit Quantization For Distributed Algorithms With Linear Convergence
   Nicolò Michelusi, G. Scutari, Chang-Shen Lee
   23 Jul 2021 · MQ

6. Crossover-SGD: A gossip-based communication in distributed deep learning for alleviating large mini-batch problem and enhancing scalability
   Sangho Yeo, Minho Bae, Minjoong Jeong, Oh-Kyoung Kwon, Sangyoon Oh
   30 Dec 2020

7. Asynchronous Decentralized Learning of a Neural Network
   Xinyue Liang, Alireza M. Javid, Mikael Skoglund, Saikat Chatterjee
   10 Apr 2020

8. Private and Communication-Efficient Edge Learning: A Sparse Differential Gaussian-Masking Distributed SGD Approach
   Xin Zhang, Minghong Fang, Jia-Wei Liu, Zhengyuan Zhu
   12 Jan 2020 · FedML

9. Distributed Fixed Point Methods with Compressed Iterates
   Sélim Chraibi, Ahmed Khaled, D. Kovalev, Peter Richtárik, Adil Salim, Martin Takáč
   20 Dec 2019 · FedML

10. Communication-Efficient Network-Distributed Optimization with Differential-Coded Compressors
    Xin Zhang, Jia-Wei Liu, Zhengyuan Zhu, Elizabeth S. Bentley
    06 Dec 2019

11. FedPAQ: A Communication-Efficient Federated Learning Method with Periodic Averaging and Quantization
    Amirhossein Reisizadeh, Aryan Mokhtari, Hamed Hassani, Ali Jadbabaie, Ramtin Pedarsani
    28 Sep 2019 · FedML

12. Robust and Communication-Efficient Collaborative Learning
    Amirhossein Reisizadeh, Hossein Taheri, Aryan Mokhtari, Hamed Hassani, Ramtin Pedarsani
    24 Jul 2019

13. An Exact Quantized Decentralized Gradient Descent Algorithm
    Amirhossein Reisizadeh, Aryan Mokhtari, Hamed Hassani, Ramtin Pedarsani
    29 Jun 2018