Is Network the Bottleneck of Distributed Training?

17 June 2020 · arXiv:2006.10103
Zhen Zhang, Chaokun Chang, Haibin Lin, Yida Wang, R. Arora, Xin Jin

Papers citing "Is Network the Bottleneck of Distributed Training?"

5 / 5 papers shown
Title
The Evolution of Distributed Systems for Graph Neural Networks and their Origin in Graph Processing and Deep Learning: A Survey
Jana Vatter, R. Mayer, Hans-Arno Jacobsen
Tags: GNN, AI4TS, AI4CE
23 May 2023

GraVAC: Adaptive Compression for Communication-Efficient Distributed DL Training
S. Tyagi, Martin Swany
20 May 2023

Communication-efficient distributed eigenspace estimation with arbitrary node failures
Vasileios Charisopoulos, Anil Damle
31 May 2022

FuncPipe: A Pipelined Serverless Framework for Fast and Cost-efficient Training of Deep Learning Models
Yunzhuo Liu, Bo Jiang, Tian Guo, Zimeng Huang, Wen-ping Ma, Xinbing Wang, Chenghu Zhou
28 Apr 2022

On the Utility of Gradient Compression in Distributed Training Systems
Saurabh Agarwal, Hongyi Wang, Shivaram Venkataraman, Dimitris Papailiopoulos
28 Feb 2021