Nonparametric Regression with Shallow Overparameterized Neural Networks Trained by GD with Early Stopping
Ilja Kuzborskij, Csaba Szepesvári
arXiv:2107.05341, 12 July 2021

Papers citing "Nonparametric Regression with Shallow Overparameterized Neural Networks Trained by GD with Early Stopping"

6 citing papers shown.
Gradient Descent Finds Over-Parameterized Neural Networks with Sharp Generalization for Nonparametric Regression
Yingzhen Yang, Ping Li
05 Nov 2024
Neural Networks Efficiently Learn Low-Dimensional Representations with SGD
International Conference on Learning Representations (ICLR), 2022
Alireza Mousavi-Hosseini, Sejun Park, M. Girotti, Ioannis Mitliagkas, Murat A. Erdogdu
29 Sep 2022
Towards Data-Algorithm Dependent Generalization: a Case Study on Overparameterized Linear Regression
Neural Information Processing Systems (NeurIPS), 2022
Jing Xu, Jiaye Teng, Yang Yuan, Andrew Chi-Chih Yao
12 Feb 2022
Stability & Generalisation of Gradient Descent for Shallow Neural Networks without the Neural Tangent Kernel
Neural Information Processing Systems (NeurIPS), 2021
Dominic Richards, Ilja Kuzborskij
27 Jul 2021
On the Role of Optimization in Double Descent: A Least Squares Study
Neural Information Processing Systems (NeurIPS), 2021
Ilja Kuzborskij, Csaba Szepesvári, Omar Rivasplata, Amal Rannen-Triki, Razvan Pascanu
27 Jul 2021
Generalization of GANs and overparameterized models under Lipschitz continuity
Khoat Than, Nghia D. Vu
06 Apr 2021