
Kernel regression in high dimension: Refined analysis beyond double descent

Abstract

In this paper, we provide a precise characterization of the generalization properties of high-dimensional kernel ridge regression across the under- and over-parameterized regimes, depending on whether the number of training samples n exceeds the feature dimension d. By establishing a novel bias-variance decomposition of the expected excess risk, we show that, while the bias is independent of d and monotonically decreases with n, the variance depends on both n and d and can be unimodal or monotonically decreasing under different regularization schemes. Our refined analysis goes beyond the double descent theory by showing that, depending on the data eigen-profile and the level of regularization, the kernel regression risk curve can be a double-descent-like, bell-shaped, or monotonic function of n. Experiments on synthetic and real data are conducted to support our theoretical findings.
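As a toy illustration of the bias-variance decomposition discussed in the abstract, the following sketch estimates the squared bias and variance of kernel ridge regression as the sample size n grows. All settings here (the RBF kernel, bandwidth gamma, regularization lam, sinusoidal target, and Monte-Carlo averaging over fresh training sets) are hypothetical choices for illustration, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(A, B, gamma=0.1):
    # Gram matrix K[i, j] = exp(-gamma * ||A_i - B_j||^2)
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def krr_predict(X, y, X_test, lam=1e-2, gamma=0.1):
    # Kernel ridge regression: alpha = (K + n*lam*I)^{-1} y
    n = len(X)
    K = rbf_kernel(X, X, gamma)
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y)
    return rbf_kernel(X_test, X, gamma) @ alpha

d = 5                                        # feature dimension (illustrative)
f = lambda X: np.sin(X.sum(axis=1))          # hypothetical target function
X_test = rng.standard_normal((200, d))
y_star = f(X_test)                           # noiseless test targets

results = {}
for n in [10, 50, 250]:
    # Monte-Carlo estimate of squared bias and variance over fresh training sets
    preds = []
    for _ in range(30):
        X = rng.standard_normal((n, d))
        y = f(X) + 0.5 * rng.standard_normal(n)
        preds.append(krr_predict(X, y, X_test))
    preds = np.array(preds)
    bias2 = ((preds.mean(axis=0) - y_star) ** 2).mean()
    var = preds.var(axis=0).mean()
    results[n] = (bias2, var)
    print(f"n={n:4d}  bias^2={bias2:.3f}  variance={var:.3f}")
```

Under these toy settings the squared bias shrinks as n grows, consistent with the abstract's claim that the bias decreases monotonically with n; the shape of the variance curve, by contrast, depends on the regularization level lam.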
