Parallelizing Training of Deep Generative Models on Massive Scientific Datasets (arXiv:1910.02270)

Date: 5 October 2019
Authors: S. A. Jacobs, B. Van Essen, D. Hysom, Jae-Seung Yeom, Tim Moon, Rushil Anirudh, Jayaraman J. Thiagarajan, Shusen Liu, P. Bremer, J. Gaffney, Tom Benson, Peter B. Robinson, L. Peterson, B. Spears
Topics: BDL, AI4CE

Papers citing "Parallelizing Training of Deep Generative Models on Massive Scientific Datasets"

3 of 3 papers shown
SOLAR: A Highly Optimized Data Loading Framework for Distributed Training of CNN-based Scientific Surrogates
Authors: Baixi Sun, Xiaodong Yu, Chengming Zhang, Jiannan Tian, Sian Jin, K. Iskra, Tao Zhou, Tekin Bicer, Pete Beckman, Dingwen Tao
Date: 01 Nov 2022
The Case for Strong Scaling in Deep Learning: Training Large 3D CNNs with Hybrid Parallelism
Authors: Yosuke Oyama, N. Maruyama, Nikoli Dryden, Erin McCarthy, P. Harrington, J. Balewski, Satoshi Matsuoka, Peter Nugent, B. Van Essen
Topics: 3DV, AI4CE
Date: 25 Jul 2020
On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima
Authors: N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang
Topics: ODL
Date: 15 Sep 2016