Hidden Unit Specialization in Layered Neural Networks: ReLU vs. Sigmoidal Activation
Elisa Oostwal, Michiel Straat, Michael Biehl
arXiv:1910.07476 · 16 October 2019
Papers citing "Hidden Unit Specialization in Layered Neural Networks: ReLU vs. Sigmoidal Activation" (1 paper):
Neural Redshift: Random Networks are not Random Functions
Damien Teney, A. Nicolicioiu, Valentin Hartmann, Ehsan Abbasnejad
04 Mar 2024