ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

arXiv:2210.12184 · Cited By
A New Perspective for Understanding Generalization Gap of Deep Neural Networks Trained with Large Batch Sizes

21 October 2022
O. Oyedotun, Konstantinos Papadopoulos, Djamila Aouada
Community: AI4CE

Papers citing "A New Perspective for Understanding Generalization Gap of Deep Neural Networks Trained with Large Batch Sizes"

2 / 2 papers shown

1. Aggregated Residual Transformations for Deep Neural Networks
   Saining Xie, Ross B. Girshick, Piotr Dollár, Z. Tu, Kaiming He
   16 Nov 2016

2. On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima
   N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang (ODL)
   15 Sep 2016