arXiv:2202.03348
Failure and success of the spectral bias prediction for Kernel Ridge Regression: the case of low-dimensional data

7 February 2022
Umberto M. Tomasini
Antonio Sclocchi
M. Wyart
Abstract

Recently, several theories, including the replica method, have made predictions for the generalization error of Kernel Ridge Regression. In some regimes, they predict that the method has a "spectral bias": decomposing the true function $f^*$ on the eigenbasis of the kernel, it fits well the coefficients associated with the $O(P)$ largest eigenvalues, where $P$ is the size of the training set. This prediction works very well on benchmark data sets such as images, yet the assumptions these approaches make on the data are never satisfied in practice. To clarify when the spectral bias prediction holds, we first focus on a one-dimensional model where rigorous results are obtained, and then use scaling arguments to generalize and test our findings in higher dimensions. Our predictions include the classification case $f(x) = \mathrm{sign}(x_1)$ with a data distribution that vanishes at the decision boundary, $p(x) \sim x_1^{\chi}$. For $\chi > 0$ and a Laplace kernel, we find that (i) there exists a cross-over ridge $\lambda^*_{d,\chi}(P) \sim P^{-\frac{1}{d+\chi}}$ such that the replica method applies for $\lambda \gg \lambda^*_{d,\chi}(P)$ but not for $\lambda \ll \lambda^*_{d,\chi}(P)$, and (ii) in the ridgeless case, spectral bias predicts the correct training-curve exponent only in the limit $d \rightarrow \infty$.
