Parallelizing Training of Deep Generative Models on Massive Scientific Datasets
arXiv:1910.02270 · 5 October 2019
S. A. Jacobs, B. Van Essen, D. Hysom, Jae-Seung Yeom, Tim Moon, Rushil Anirudh, Jayaraman J. Thiagarajan, Shusen Liu, P. Bremer, J. Gaffney, Tom Benson, Peter B. Robinson, L. Peterson, B. Spears
Papers citing "Parallelizing Training of Deep Generative Models on Massive Scientific Datasets" (3 papers):
SOLAR: A Highly Optimized Data Loading Framework for Distributed Training of CNN-based Scientific Surrogates
Baixi Sun, Xiaodong Yu, Chengming Zhang, Jiannan Tian, Sian Jin, K. Iskra, Tao Zhou, Tekin Bicer, Pete Beckman, Dingwen Tao
01 Nov 2022
The Case for Strong Scaling in Deep Learning: Training Large 3D CNNs with Hybrid Parallelism
Yosuke Oyama, N. Maruyama, Nikoli Dryden, Erin McCarthy, P. Harrington, J. Balewski, Satoshi Matsuoka, Peter Nugent, B. Van Essen
25 Jul 2020
On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima
N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang
15 Sep 2016