arXiv:2003.03009
Communication optimization strategies for distributed deep neural network training: A survey
6 March 2020
Shuo Ouyang, Dezun Dong, Yemao Xu, Liquan Xiao
Papers citing "Communication optimization strategies for distributed deep neural network training: A survey" (5 of 5 papers shown)

Fair and Efficient Distributed Edge Learning with Hybrid Multipath TCP
Shiva Raj Pokhrel, Jinho D. Choi, A. Walid (03 Nov 2022)

DBS: Dynamic Batch Size For Distributed Deep Neural Network Training
Qing Ye, Yuhao Zhou, Mingjia Shi, Yanan Sun, Jiancheng Lv (23 Jul 2020)

Enabling Compute-Communication Overlap in Distributed Deep Learning Training Platforms
Saeed Rashidi, Matthew Denton, Srinivas Sridharan, S. Srinivasan, Amoghavarsha Suresh, Jade Nie, T. Krishna (30 Jun 2020)

On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima
N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang (15 Sep 2016)

Optimal Distributed Online Prediction using Mini-Batches
O. Dekel, Ran Gilad-Bachrach, Ohad Shamir, Lin Xiao (07 Dec 2010)