Ivanov-Regularised Least-Squares Estimators over Large RKHSs and Their Interpolation Spaces

Journal of Machine Learning Research (JMLR), 2017
12 June 2017
Stephen Page
Steffen Grünewälder
Abstract

We study kernel least-squares estimation under a norm constraint. This form of regularisation is known as Ivanov regularisation and it provides better control of the norm of the estimator than the well-established Tikhonov regularisation. This choice of regularisation allows us to dispose of the standard assumption that the reproducing kernel Hilbert space (RKHS) has a Mercer kernel, which is restrictive as it usually requires compactness of the covariate set. Instead, we assume only that the RKHS is separable with a bounded and measurable kernel. We provide rates of convergence for the expected squared $L^2$ error of our estimator under the weak assumption that the variance of the response variables is bounded and the unknown regression function lies in an interpolation space between $L^2$ and the RKHS. We then obtain faster rates of convergence when the regression function is bounded by clipping the estimator. In fact, we attain the optimal rate of convergence. Furthermore, we provide a high-probability bound under the stronger assumption that the response variables have subgaussian errors and that the regression function lies in an interpolation space between $L^\infty$ and the RKHS. Finally, we derive adaptive results for the settings in which the regression function is bounded.
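Concretely, the Ivanov estimator constrains the RKHS norm rather than penalising it: it minimises the empirical risk $\frac{1}{n}\sum_{i=1}^n (Y_i - f(X_i))^2$ over the ball $\{f \in H : \|f\|_H \le r\}$, whereas Tikhonov regularisation minimises the same empirical risk plus $\lambda \|f\|_H^2$. For this convex problem the constrained solution coincides with a kernel ridge solution whose ridge parameter is tuned so that the norm budget is tight. The sketch below illustrates that standard route together with the clipping step; it is not the paper's construction, and the names ivanov_krr and clipped_predict, the radius and bound arguments, and the bisection tolerance are illustrative choices.

import numpy as np

def ivanov_krr(K, y, radius, tol=1e-6, max_iter=100):
    """Norm-constrained (Ivanov) kernel least squares via its Tikhonov dual.

    Minimises sum_i (y_i - f(x_i))^2 subject to ||f||_H <= radius by
    bisecting over the ridge parameter lam: the kernel ridge solution
    alpha = (K + lam*I)^{-1} y has RKHS norm sqrt(alpha^T K alpha),
    which decreases monotonically in lam.
    K: (n, n) Gram matrix, y: (n,) response vector.
    """
    n = len(y)

    def ridge(lam):
        alpha = np.linalg.solve(K + lam * np.eye(n), y)
        return alpha, np.sqrt(max(alpha @ K @ alpha, 0.0))

    alpha, norm = ridge(1e-12)          # (almost) unconstrained solution
    if norm <= radius:                  # constraint inactive: keep it
        return alpha
    lo, hi = 1e-12, 1.0
    while ridge(hi)[1] > radius:        # grow lam until the budget is met
        hi *= 10.0
    for _ in range(max_iter):           # bisect to make the constraint tight
        mid = 0.5 * (lo + hi)
        alpha, norm = ridge(mid)
        if abs(norm - radius) <= tol:
            break
        lo, hi = (mid, hi) if norm > radius else (lo, mid)
    return alpha

def clipped_predict(K_test, alpha, bound):
    """Clip predictions to [-bound, bound], mirroring the clipped estimator."""
    return np.clip(K_test @ alpha, -bound, bound)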
