Positively Scale-Invariant Flatness of ReLU Neural Networks

6 March 2019 · arXiv:1903.02237
Mingyang Yi, Qi Meng, Wei-neng Chen, Zhi-Ming Ma, Tie-Yan Liu

Papers citing "Positively Scale-Invariant Flatness of ReLU Neural Networks" (5 of 5 papers shown)
Local Identifiability of Deep ReLU Neural Networks: the Theory
Joachim Bona-Pellissier, François Malgouyres, F. Bachoc
FAtt · 15 Jun 2022 · 67 / 6 / 0

Understanding the Generalization Benefit of Normalization Layers: Sharpness Reduction
Kaifeng Lyu, Zhiyuan Li, Sanjeev Arora
FAtt · 14 Jun 2022 · 37 / 69 / 0

An Embedding of ReLU Networks and an Analysis of their Identifiability
Pierre Stock, Rémi Gribonval
20 Jul 2021 · 28 / 17 / 0

The Representation Theory of Neural Networks
M. Armenta, Pierre-Marc Jodoin
23 Jul 2020 · 19 / 30 / 0

On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima
N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang
ODL · 15 Sep 2016 · 281 / 2,889 / 0