Hidden Unit Specialization in Layered Neural Networks: ReLU vs. Sigmoidal Activation
16 October 2019
Elisa Oostwal, Michiel Straat, Michael Biehl
Papers citing "Hidden Unit Specialization in Layered Neural Networks: ReLU vs. Sigmoidal Activation" (2 papers shown):

Neural Redshift: Random Networks are not Random Functions
Damien Teney, A. Nicolicioiu, Valentin Hartmann, Ehsan Abbasnejad
04 Mar 2024

Online Learning for the Random Feature Model in the Student-Teacher Framework
Roman Worschech, B. Rosenow
24 Mar 2023