
How many Neurons do we need? A refined Analysis for Shallow Networks trained with Gradient Descent

14 September 2023
Mike Nguyen, Nicole Mücke
Topic: MLT
arXiv: 2309.08044 (PDF and HTML available)

Papers citing "How many Neurons do we need? A refined Analysis for Shallow Networks trained with Gradient Descent"

Showing 4 of 4 citing papers.

Title | Authors | Topics | Date
Neuroplasticity in Artificial Intelligence -- An Overview and Inspirations on Drop In & Out Learning | Yupei Li, M. Milling, Björn Schuller | AI4CE | 27 Mar 2025
Finite Samples for Shallow Neural Networks | Yu Xia, Zhiqiang Xu | - | 17 Mar 2025
Stochastic Gradient Descent for Two-layer Neural Networks | Dinghao Cao, Zheng-Chu Guo, Lei Shi | MLT | 10 Jul 2024
Empirical Risk Minimization in the Interpolating Regime with Application to Neural Network Learning | Nicole Mücke, Ingo Steinwart | AI4CE | 25 May 2019