Hidden Unit Specialization in Layered Neural Networks: ReLU vs. Sigmoidal Activation

16 October 2019
Elisa Oostwal
Michiel Straat
Michael Biehl
    MLT

Papers citing "Hidden Unit Specialization in Layered Neural Networks: ReLU vs. Sigmoidal Activation"

2 papers shown
1. Neural Redshift: Random Networks are not Random Functions
   Damien Teney, A. Nicolicioiu, Valentin Hartmann, Ehsan Abbasnejad
   04 Mar 2024

2. Online Learning for the Random Feature Model in the Student-Teacher Framework
   Roman Worschech, B. Rosenow
   24 Mar 2023