arXiv:2006.10103
Is Network the Bottleneck of Distributed Training?
17 June 2020
Zhen Zhang, Chaokun Chang, Haibin Lin, Yida Wang, R. Arora, Xin Jin
Papers citing "Is Network the Bottleneck of Distributed Training?" (5 papers shown)
The Evolution of Distributed Systems for Graph Neural Networks and their Origin in Graph Processing and Deep Learning: A Survey
Jana Vatter, R. Mayer, Hans-Arno Jacobsen. 23 May 2023. Tags: GNN, AI4TS, AI4CE
GraVAC: Adaptive Compression for Communication-Efficient Distributed DL Training
S. Tyagi, Martin Swany. 20 May 2023
Communication-efficient distributed eigenspace estimation with arbitrary node failures
Vasileios Charisopoulos, Anil Damle. 31 May 2022
FuncPipe: A Pipelined Serverless Framework for Fast and Cost-efficient Training of Deep Learning Models
Yunzhuo Liu, Bo Jiang, Tian Guo, Zimeng Huang, Wen-ping Ma, Xinbing Wang, Chenghu Zhou. 28 Apr 2022
On the Utility of Gradient Compression in Distributed Training Systems
Saurabh Agarwal, Hongyi Wang, Shivaram Venkataraman, Dimitris Papailiopoulos. 28 Feb 2021