Positively Scale-Invariant Flatness of ReLU Neural Networks
Mingyang Yi, Qi Meng, Wei Chen, Zhi-Ming Ma, Tie-Yan Liu
arXiv:1903.02237 · 6 March 2019
Papers citing "Positively Scale-Invariant Flatness of ReLU Neural Networks" (5 of 5 papers shown)
1. Local Identifiability of Deep ReLU Neural Networks: the Theory
   Joachim Bona-Pellissier, François Malgouyres, F. Bachoc
   15 Jun 2022

2. Understanding the Generalization Benefit of Normalization Layers: Sharpness Reduction
   Kaifeng Lyu, Zhiyuan Li, Sanjeev Arora
   14 Jun 2022

3. An Embedding of ReLU Networks and an Analysis of their Identifiability
   Pierre Stock, Rémi Gribonval
   20 Jul 2021

4. The Representation Theory of Neural Networks
   M. Armenta, Pierre-Marc Jodoin
   23 Jul 2020

5. On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima
   N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang
   15 Sep 2016