AdaBatch: Adaptive Batch Sizes for Training Deep Neural Networks

6 December 2017 · arXiv:1712.02029
Aditya Devarakonda, Maxim Naumov, M. Garland

Papers citing "AdaBatch: Adaptive Batch Sizes for Training Deep Neural Networks" (5 of 55 shown)

1. Highly Scalable Deep Learning Training System with Mixed-Precision: Training ImageNet in Four Minutes
   Chencan Wu, Shutao Song, W. He, Yangzihao Wang, Haidong Rong, ..., Li Yu, Tiegang Chen, Guangxiao Hu, Shaoshuai Shi, Xiaowen Chu
   30 Jul 2018

2. Semi-Dynamic Load Balancing: Efficient Distributed Learning in Non-Dedicated Environments
   Chen Chen, Qizhen Weng, Wei Wang, Baochun Li, Bo Li
   07 Jun 2018

3. A Progressive Batching L-BFGS Method for Machine Learning
   Raghu Bollapragada, Dheevatsa Mudigere, J. Nocedal, Hao-Jun Michael Shi, P. T. P. Tang
   15 Feb 2018

4. Parallel Complexity of Forward and Backward Propagation
   Maxim Naumov
   18 Dec 2017

5. Parallelizing Word2Vec in Shared and Distributed Memory
   Shihao Ji, N. Satish, Sheng Li, Pradeep Dubey
   15 Apr 2016