Optimal kernel regression bounds under energy-bounded noise

Non-conservative uncertainty bounds are key both for assessing an estimation algorithm's accuracy and for downstream tasks, such as deployment in safety-critical contexts. In this paper, we derive a tight, non-asymptotic uncertainty bound for kernel-based estimation that can also handle correlated noise sequences. Its computation relies on a mild norm-boundedness assumption on the unknown function and on the noise, and it returns the worst-case function realization within the hypothesis class at an arbitrary query input location. The value of this worst-case function is shown to be given in terms of the posterior mean and covariance of a Gaussian process for an optimal choice of the measurement noise covariance. By rigorously analyzing the proposed approach and comparing it with other results in the literature, we demonstrate its effectiveness in returning tight and easy-to-compute bounds for kernel-based estimates.
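
As a concrete illustration of the kind of bound described above, the sketch below computes a Gaussian-process posterior mean and variance and a worst-case half-width of a form common in the RKHS error-bound literature, assuming an RKHS norm bound Gamma on the unknown function and an energy bound delta on the noise sequence. The RBF kernel choice, the function names, and the exact bound expression are illustrative assumptions for this sketch, not the paper's derivation; in particular, the regularization parameter lam stands in for the measurement noise covariance that the paper optimizes.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0):
    """Squared-exponential (RBF) kernel matrix between input sets A and B."""
    d2 = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2 * A @ B.T
    return np.exp(-0.5 * d2 / lengthscale**2)

def kernel_bound(X, y, x_star, Gamma, delta, lam, lengthscale=1.0):
    """Worst-case bound sketch at query points x_star, assuming
    ||f||_H <= Gamma (RKHS norm bound) and ||eps||_2^2 <= delta
    (energy-bounded noise). `lam` plays the role of the GP
    measurement-noise variance; the paper optimizes this choice.
    The bound form here is a common variant from the literature,
    not necessarily the paper's exact expression."""
    K = rbf_kernel(X, X, lengthscale)
    k_star = rbf_kernel(X, x_star, lengthscale)          # shape (n, m)
    G = K + lam * np.eye(len(X))
    alpha = np.linalg.solve(G, y)                        # GP posterior weights
    mu = k_star.T @ alpha                                # posterior mean at x_star
    # posterior variance: k(x*, x*) - k*^T (K + lam I)^{-1} k*
    var = rbf_kernel(x_star, x_star, lengthscale).diagonal() \
          - np.einsum('ij,ji->i', k_star.T, np.linalg.solve(G, k_star))
    var = np.maximum(var, 0.0)
    # squared radius: Gamma^2 + delta/lam - y^T (K + lam I)^{-1} y, clipped at zero
    radius2 = max(Gamma**2 + delta / lam - float(y @ alpha), 0.0)
    half_width = np.sqrt(radius2 * var)
    return mu, mu - half_width, mu + half_width

# Usage: noisy samples of a smooth function.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(30, 1))
eps = rng.normal(0.0, 0.1, size=30)
y = np.sin(X[:, 0]) + eps
x_star = np.linspace(-3, 3, 5)[:, None]
# delta uses the simulated noise energy here; in practice it is an assumed bound.
mu, lo, hi = kernel_bound(X, y, x_star, Gamma=2.0, delta=float(eps @ eps), lam=0.1)
```

Note that the half-width shrinks where the posterior variance is small, so the bound tightens near observed data, consistent with the non-conservative behavior the abstract emphasizes.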
@article{lahr2025_2505.22235,
  title   = {Optimal kernel regression bounds under energy-bounded noise},
  author  = {Amon Lahr and Johannes Köhler and Anna Scampicchio and Melanie N. Zeilinger},
  journal = {arXiv preprint arXiv:2505.22235},
  year    = {2025}
}