Ivanov-Regularised Least-Squares Estimators over Large RKHSs and Their Interpolation Spaces
We study kernel least-squares estimation under a norm constraint. This form of regularisation is known as Ivanov regularisation and it provides far better control of the regression function than the well-established Tikhonov regularisation. In particular, the smoothness and the scale of the regression function are directly controlled by the constraint. This choice of estimator also allows us to dispose of the standard assumption that the reproducing kernel Hilbert space (RKHS) has a Mercer kernel. The Mercer assumption is restrictive as it usually requires compactness of the covariate set. Instead, we make only the minimal assumption that the RKHS is separable with a bounded and measurable kernel. We provide rates of convergence for the expected squared error of our estimator under the weak assumption that the variance of the response variables is bounded and the unknown regression function lies in an interpolation space between L^2 and the RKHS. We complement this result with a high-probability bound under the stronger assumption that the response variables have subgaussian errors and that the regression function is bounded. Finally, we derive an adaptive version of the high-probability bound under the assumption that the response variables are bounded. The rates we achieve are close to the optimal rates attained under the stronger Mercer kernel assumption.
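To make the contrast between the two forms of regularisation concrete, the following sketch writes out the standard formulations, using illustrative notation not fixed by the abstract: data (X_1, Y_1), ..., (X_n, Y_n), an RKHS H with norm ||.||_H, a constraint radius gamma for the Ivanov estimator, and a penalty parameter lambda > 0 for the Tikhonov estimator.

```latex
% Ivanov regularisation: empirical risk minimisation under a hard norm constraint.
% The constraint radius \gamma directly bounds the norm of the estimator.
\hat{f}_{\mathrm{Ivanov}}
  \in \operatorname*{arg\,min}_{f \in H,\ \|f\|_H \le \gamma}
     \frac{1}{n} \sum_{i=1}^{n} \bigl( Y_i - f(X_i) \bigr)^2

% Tikhonov regularisation: empirical risk plus a squared-norm penalty.
% The norm of the estimator is controlled only implicitly through \lambda.
\hat{f}_{\mathrm{Tikhonov}}
  \in \operatorname*{arg\,min}_{f \in H}
     \frac{1}{n} \sum_{i=1}^{n} \bigl( Y_i - f(X_i) \bigr)^2
     + \lambda \, \|f\|_H^2
```

The Ivanov formulation fixes the norm budget of the estimator explicitly, which is the sense in which the scale (and, via the norm, the smoothness) is controlled directly by the constraint; under Tikhonov regularisation the achieved norm depends on lambda and the data only implicitly.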