Theano-MPI: a Theano-based Distributed Training Framework

26 May 2016
He Ma, Fei Mao, Graham W. Taylor
    GNN

Papers citing "Theano-MPI: a Theano-based Distributed Training Framework"

5 / 5 papers shown

Characterizing Deep-Learning I/O Workloads in TensorFlow
Steven W. D. Chien, Stefano Markidis, C. Sishtla, Luís Santos, Pawel Herman, Sai B. Narasimhamurthy, Erwin Laure
06 Oct 2018

Empirical Evaluation of Parallel Training Algorithms on Acoustic Modeling
Wenpeng Li, BinBin Zhang, Lei Xie, Dong Yu
17 Mar 2017

Exploring the Design Space of Deep Convolutional Neural Networks at Large Scale
F. Iandola
3DV
20 Dec 2016

Generative Adversarial Parallelization
Daniel Jiwoong Im, He Ma, C. Kim, Graham W. Taylor
GAN
13 Dec 2016

Distributed Training of Deep Neural Networks: Theoretical and Practical Limits of Parallel Scalability
J. Keuper, Franz-Josef Pfreundt
GNN
22 Sep 2016