TicTac: Accelerating Distributed Deep Learning with Communication Scheduling
arXiv:1803.03288 (v2, latest), 8 March 2018
Sayed Hadi Hashemi, Sangeetha Abdu Jyothi, R. Campbell
Papers citing "TicTac: Accelerating Distributed Deep Learning with Communication Scheduling" (4 of 54 shown):
- Characterizing Deep Learning Training Workloads on Alibaba-PAI — Mengdi Wang, Chen Meng, Guoping Long, Chuan Wu, Jun Yang, Wei Lin, Yangqing Jia (14 Oct 2019)
- Taming Momentum in a Distributed Asynchronous Environment — Ido Hakimi, Saar Barkai, Moshe Gabel, Assaf Schuster (26 Jul 2019)
- Scalable Deep Learning on Distributed Infrastructures: Challenges, Techniques and Tools — R. Mayer, Hans-Arno Jacobsen (27 Mar 2019)
- Optimizing Network Performance for Distributed DNN Training on GPU Clusters: ImageNet/AlexNet Training in 1.5 Minutes — Peng Sun, Wansen Feng, Ruobing Han, Shengen Yan, Yonggang Wen (19 Feb 2019)