How many Neurons do we need? A refined Analysis for Shallow Networks trained with Gradient Descent

Journal of Statistical Planning and Inference (JSPI), 2023
14 September 2023
Mike Nguyen, Nicole Mücke

Papers citing "How many Neurons do we need? A refined Analysis for Shallow Networks trained with Gradient Descent"

Neuroplasticity in Artificial Intelligence -- An Overview and Inspirations on Drop In & Out Learning
Yupei Li, M. Milling, Björn Schuller
27 Mar 2025
Finite Samples for Shallow Neural Networks
Yu Xia, Zhiqiang Xu
17 Mar 2025
Random feature approximation for general spectral methods
Mike Nguyen, Nicole Mücke
29 Aug 2023