
Rethinking generalization requires revisiting old ideas: statistical mechanics approaches and complex learning behavior

Charles H. Martin, Michael W. Mahoney
arXiv:1710.09553 (26 October 2017)

Papers citing "Rethinking generalization requires revisiting old ideas: statistical mechanics approaches and complex learning behavior"

12 / 12 papers shown
  • A Model Zoo on Phase Transitions in Neural Networks
    Konstantin Schurholt, Léo Meynent, Yefan Zhou, Haiquan Lu, Yaoqing Yang, Damian Borth. 25 Apr 2025.
  • On the Overlooked Structure of Stochastic Gradients
    Zeke Xie, Qian-Yuan Tang, Mingming Sun, P. Li. 05 Dec 2022.
  • On Random Matrices Arising in Deep Neural Networks: General I.I.D. Case
    L. Pastur, V. Slavin. 20 Nov 2020.
  • Towards Efficient Training for Neural Network Quantization
    Qing Jin, Linjie Yang, Zhenyu A. Liao. 21 Dec 2019.
  • Deep learning with noisy labels: exploring techniques and remedies in medical image analysis
    Davood Karimi, Haoran Dou, Simon K. Warfield, Ali Gholipour. 05 Dec 2019.
  • Parameter Re-Initialization through Cyclical Batch Size Schedules
    Norman Mu, Z. Yao, A. Gholami, Kurt Keutzer, Michael W. Mahoney. 04 Dec 2018.
  • Implicit Self-Regularization in Deep Neural Networks: Evidence from Random Matrix Theory and Implications for Learning
    Charles H. Martin, Michael W. Mahoney. 02 Oct 2018.
  • The committee machine: Computational to statistical gaps in learning a two-layers neural network
    Benjamin Aubin, Antoine Maillard, Jean Barbier, Florent Krzakala, N. Macris, Lenka Zdeborová. 14 Jun 2018.
  • Invariance of Weight Distributions in Rectified MLPs
    Russell Tsuchida, Farbod Roosta-Khorasani, M. Gallagher. 24 Nov 2017.
  • Optimal Errors and Phase Transitions in High-Dimensional Generalized Linear Models
    Jean Barbier, Florent Krzakala, N. Macris, Léo Miolane, Lenka Zdeborová. 10 Aug 2017.
  • On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima
    N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang. 15 Sep 2016.
  • The Loss Surfaces of Multilayer Networks
    A. Choromańska, Mikael Henaff, Michaël Mathieu, Gerard Ben Arous, Yann LeCun. 30 Nov 2014.