Sharp Asymptotics of Kernel Ridge Regression Beyond the Linear Regime

13 May 2022
Hong Hu
Yue M. Lu
Abstract

The generalization performance of kernel ridge regression (KRR) exhibits a multi-phased pattern that crucially depends on the scaling relationship between the sample size $n$ and the underlying dimension $d$. This phenomenon is due to the fact that KRR sequentially learns functions of increasing complexity as the sample size increases; when $d^{k-1} \ll n \ll d^{k}$, only polynomials with degree less than $k$ are learned. In this paper, we present a sharp asymptotic characterization of the performance of KRR at the critical transition regions with $n \asymp d^{k}$, for $k \in \mathbb{Z}^{+}$. Our asymptotic characterization provides a precise picture of the whole learning process and clarifies the impact of various parameters (including the choice of the kernel function) on the generalization performance. In particular, we show that the learning curves of KRR can have a delicate "double descent" behavior due to specific bias-variance trade-offs at different polynomial scaling regimes.
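
A minimal sketch of the phenomenon the abstract describes, not code from the paper: it traces an empirical KRR learning curve as the sample size $n$ grows relative to the dimension $d$, using data on the unit sphere and a toy low-degree polynomial target. The kernel choice, regularization level, target function, and sample-size grid below are all illustrative assumptions.

```python
# Illustrative sketch (assumptions, not the paper's setup): watch KRR test
# error drop in stages as n crosses successive powers of d.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)
d = 20  # underlying dimension (assumed for illustration)

def sphere_samples(n, d):
    """Draw n points approximately uniform on the unit sphere in R^d."""
    x = rng.standard_normal((n, d))
    return x / np.linalg.norm(x, axis=1, keepdims=True)

def target(X):
    """Toy target mixing a degree-1 and a degree-2 polynomial component."""
    return X[:, 0] + X[:, 0] * X[:, 1]

X_test = sphere_samples(2000, d)
y_test = target(X_test)

# Sample sizes spanning n ~ d up to n ~ d^2, the first two scaling regimes.
for n in [d // 2, d, 5 * d, d**2 // 2, d**2]:
    X = sphere_samples(n, d)
    y = target(X) + 0.1 * rng.standard_normal(n)  # noisy labels
    model = KernelRidge(alpha=1e-3, kernel="rbf", gamma=1.0 / d)
    model.fit(X, y)
    mse = np.mean((model.predict(X_test) - y_test) ** 2)
    print(f"n = {n:5d}   test MSE = {mse:.4f}")
```

Under this toy setup, the degree-1 part of the target is picked up once $n$ is comfortably larger than $d$, while the degree-2 part only starts to be learned as $n$ approaches $d^2$, consistent with the staircase behavior the abstract describes.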
