RedSync : Reducing Synchronization Traffic for Distributed Deep Learning

13 August 2018
Jiarui Fang
Haohuan Fu
Guangwen Yang
Cho-Jui Hsieh

Papers citing "RedSync : Reducing Synchronization Traffic for Distributed Deep Learning"

4 papers shown
GraVAC: Adaptive Compression for Communication-Efficient Distributed DL Training
S. Tyagi, Martin Swany
20 May 2023
Communication optimization strategies for distributed deep neural network training: A survey
Shuo Ouyang, Dezun Dong, Yemao Xu, Liquan Xiao
06 Mar 2020
A Distributed Synchronous SGD Algorithm with Global Top-$k$ Sparsification for Low Bandwidth Networks
Shaoshuai Shi, Qiang-qiang Wang, Kaiyong Zhao, Zhenheng Tang, Yuxin Wang, Xiang Huang, Xuming Hu
14 Jan 2019
3LC: Lightweight and Effective Traffic Compression for Distributed Machine Learning
Hyeontaek Lim, D. Andersen, M. Kaminsky
21 Feb 2018