Asymptotic Theory for Linear Functionals of Kernel Ridge Regression

An asymptotic theory is established for linear functionals of the predictive function given by kernel ridge regression, in the setting where the reproducing kernel Hilbert space is norm-equivalent to a Sobolev space. The theory covers a wide variety of linear functionals, including point evaluations, evaluations of derivatives, and inner products. We establish upper and lower bounds on the estimation error and prove asymptotic normality of the estimators. A universal optimal order of magnitude for the smoothing parameter is identified, balancing the variance against the worst-case bias. The theory also implies that the optimal error of kernel ridge regression can be attained under this optimal smoothing parameter. These optimal rates for the smoothing parameter differ from the known optimal rate that minimizes the prediction error of kernel ridge regression.
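To make the objects concrete, recall the standard definition of kernel ridge regression (this display states the usual estimator, not a formula quoted from the paper): given data $(x_i, y_i)_{i=1}^n$ and an RKHS $\mathcal{H}$ with kernel $k$,

$$\hat f_\lambda = \operatorname*{argmin}_{f \in \mathcal{H}} \; \frac{1}{n}\sum_{i=1}^n \big(y_i - f(x_i)\big)^2 + \lambda \|f\|_{\mathcal{H}}^2, \qquad \hat f_\lambda(x) = k(x, X)\,(K + n\lambda I)^{-1} y,$$

where $K = (k(x_i, x_j))_{i,j}$, and a linear functional $T$ is estimated by the plug-in value $T(\hat f_\lambda)$. The following Python sketch illustrates this pipeline; it is not from the paper, and the kernel choice, simulated data, and the smoothing level `lam = 1/n` are purely illustrative assumptions. The Matérn-3/2 kernel is used because its RKHS on the real line is norm-equivalent to a Sobolev space, matching the setting of the abstract.

```python
import numpy as np

def matern32(x, z, ell=0.2):
    """Matern-3/2 kernel on R; its RKHS is norm-equivalent to the Sobolev space H^2."""
    r = np.abs(x[:, None] - z[None, :]) / ell
    return (1.0 + np.sqrt(3.0) * r) * np.exp(-np.sqrt(3.0) * r)

def krr_coefficients(X, y, lam):
    """Solve (K + n*lam*I) alpha = y, so that f_hat(x) = k(x, X) @ alpha."""
    n = X.shape[0]
    K = matern32(X, X)
    return np.linalg.solve(K + n * lam * np.eye(n), y)

# Simulated regression data: y_i = f(x_i) + noise (illustrative, not from the paper)
rng = np.random.default_rng(0)
n = 300
X = np.sort(rng.uniform(0.0, 1.0, n))
f_true = lambda x: np.sin(2.0 * np.pi * x)
y = f_true(X) + 0.1 * rng.standard_normal(n)

lam = 1.0 / n  # illustrative smoothing level; the paper's point is that the optimal
               # order for functional estimation differs from the prediction-optimal one
alpha = krr_coefficients(X, y, lam)

# Plug-in estimates T(f_hat) for two linear functionals T:
x0 = np.array([0.37])
point_est = (matern32(x0, X) @ alpha)[0]           # T(f) = f(0.37), point evaluation
grid = np.linspace(0.0, 1.0, 2001)
integral_est = (matern32(grid, X) @ alpha).mean()  # T(f) = \int_0^1 f, Riemann approximation

print(f"point evaluation: {point_est:.4f} (truth {f_true(0.37):.4f})")
print(f"integral:         {integral_est:.4f} (truth 0.0000)")
```

In this sketch, sweeping `lam` over several orders of magnitude and tracking the error of `point_est` versus the mean-squared prediction error over `grid` is a simple way to observe that the two criteria are minimized at different smoothing levels, which is the phenomenon the paper quantifies.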