On the Acceleration of Deep Learning Model Parallelism with Staleness
An Xu, Zhouyuan Huo, Heng-Chiao Huang
arXiv: 1909.02625 · 5 September 2019
Papers citing "On the Acceleration of Deep Learning Model Parallelism with Staleness" (7 of 7 papers shown)

PETRA: Parallel End-to-end Training with Reversible Architectures
Stéphane Rivaud, Louis Fournier, Thomas Pumir, Eugene Belilovsky, Michael Eickenberg, Edouard Oyallon
04 Jun 2024 · 25 · 0 · 0

Cyclic Data Parallelism for Efficient Parallelism of Deep Neural Networks
Louis Fournier, Edouard Oyallon
13 Mar 2024 · 52 · 0 · 0

Layer-Wise Partitioning and Merging for Efficient and Scalable Deep Learning
S. Akintoye, Liangxiu Han, H. Lloyd, Xin Zhang, Darren Dancey, Haoming Chen, Daoqiang Zhang
FedML · 22 Jul 2022 · 34 · 5 · 0

Enabling All In-Edge Deep Learning: A Literature Review
Praveen Joshi, Mohammed Hasanuzzaman, Chandra Thapa, Haithem Afli, T. Scully
07 Apr 2022 · 43 · 22 · 0

Closing the Generalization Gap of Cross-silo Federated Medical Image Segmentation
An Xu, Wenqi Li, Pengfei Guo, Dong Yang, H. Roth, Ali Hatamizadeh, Can Zhao, Daguang Xu, Heng-Chiao Huang, Ziyue Xu
FedML · 18 Mar 2022 · 38 · 51 · 0

A Machine Learning Framework for Distributed Functional Compression over Wireless Channels in IoT
Yashas Malur Saidutta, Afshin Abdi, Faramarz Fekri
AI4CE · 24 Jan 2022 · 19 · 4 · 0

Detached Error Feedback for Distributed SGD with Random Sparsification
An Xu, Heng-Chiao Huang
11 Apr 2020 · 39 · 9 · 0