ResearchTrend.AI

The Slow Deterioration of the Generalization Error of the Random Feature Model
Chao Ma, Lei Wu, Weinan E
Mathematical and Scientific Machine Learning (MSML), 2020
arXiv:2008.05621, 13 August 2020

Papers citing "The Slow Deterioration of the Generalization Error of the Random Feature Model"

13 papers shown
A Mathematical Framework for Learning Probability Distributions
Journal of Machine Learning (JML), 2022
Hongkang Yang
22 Dec 2022

Side Effects of Learning from Low-dimensional Data Embedded in a Euclidean Space
Research in the Mathematical Sciences (Res. Math. Sci.), 2022
Juncai He, R. Tsai, Rachel A. Ward
01 Mar 2022

Overview frequency principle/spectral bias in deep learning
Communication on Applied Mathematics and Computation (CAMC), 2022
Z. Xu, Yaoyu Zhang
19 Jan 2022

Subspace Decomposition based DNN algorithm for elliptic type multi-scale PDEs
Xi-An Li, Z. Xu, Lei Zhang
10 Dec 2021

Conditioning of Random Feature Matrices: Double Descent and Generalization Error
Zhijun Chen, Hayden Schaeffer
21 Oct 2021

Towards Understanding the Condensation of Neural Networks at Initial Training
Neural Information Processing Systems (NeurIPS), 2021
Hanxu Zhou, Qixuan Zhou, Yaoyu Zhang, Z. Xu
25 May 2021

Frequency Principle in Deep Learning Beyond Gradient-descent-based Training
Yuheng Ma, Zhi-Qin John Xu, Jiwei Zhang
04 Jan 2021

Avoiding The Double Descent Phenomenon of Random Feature Models Using Hybrid Regularization
Kelvin K. Kan, J. Nagy, Lars Ruthotto
11 Dec 2020

On the exact computation of linear frequency principle dynamics and its generalization
Yaoyu Zhang, Zheng Ma, Z. Xu
15 Oct 2020

Towards a Mathematical Understanding of Neural Network-Based Machine Learning: what we know and what we don't
CSIAM Transactions on Applied Mathematics (CSIAM Trans. Appl. Math.), 2020
Weinan E, Chao Ma, Stephan Wojtowytsch, Lei Wu
22 Sep 2020

How Powerful are Shallow Neural Networks with Bandlimited Random Weights?
Ming Li, Sho Sonoda, Feilong Cao, Yu Wang, Jiye Liang
19 Aug 2020

Deep frequency principle towards understanding why deeper learning is faster
AAAI Conference on Artificial Intelligence (AAAI), 2020
Zhi-Qin John Xu, Hanxu Zhou
28 Jul 2020

The Quenching-Activation Behavior of the Gradient Descent Dynamics for Two-layer Neural Network Models
Chao Ma, Lei Wu, Weinan E
25 Jun 2020