Order-Optimal Error Bounds for Noisy Kernel-Based Bayesian Quadrature

In this paper, we study the sample complexity of {\em noisy Bayesian quadrature} (BQ), in which we seek to approximate an integral based on noisy black-box queries to the underlying function. We consider functions in a {\em Reproducing Kernel Hilbert Space} (RKHS) with the Mat\'ern-$\nu$ kernel, focusing on combinations of the smoothness parameter $\nu$ and the dimension $d$ such that the RKHS is equivalent to a Sobolev class. In this setting, we provide near-matching upper and lower bounds on the best possible average error. Specifically, we find that when the black-box queries are subject to Gaussian noise having variance $\sigma^2$, any algorithm making at most $T$ queries (even with adaptive sampling) must incur a mean absolute error of $\Omega(T^{-\nu/d-1} + \sigma T^{-1/2})$, and there exists a non-adaptive algorithm attaining an error of at most $O(T^{-\nu/d-1} + \sigma T^{-1/2})$. Hence, the bounds are order-optimal, and establish that there is no adaptivity gap in terms of scaling laws.
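As a rough numerical illustration of the noise-driven term in bounds of this type (this is not the paper's algorithm, and the function, query rule, and parameters below are hypothetical choices), the sketch below runs a simple non-adaptive estimator: query $T$ equally spaced points under Gaussian noise of standard deviation $\sigma$ and average the noisy values. Its mean absolute error then scales like $\sigma T^{-1/2}$, the noise floor appearing in the stated rates.

```python
import numpy as np

# Hypothetical illustration of the sigma * T^{-1/2} noise term only.
# We integrate f(x) = sin(2*pi*x) over [0, 1]; the true integral is 0,
# and the midpoint-rule bias vanishes by symmetry, isolating the noise.

def noisy_quadrature(T, sigma, rng):
    """Midpoint rule on T equally spaced (non-adaptive) noisy queries."""
    x = (np.arange(T) + 0.5) / T                      # query locations
    y = np.sin(2 * np.pi * x) + rng.normal(0.0, sigma, size=T)
    return y.mean()                                   # integral estimate

def mean_abs_error(T, sigma, trials=2000, seed=0):
    """Monte Carlo estimate of the mean absolute error over many runs."""
    rng = np.random.default_rng(seed)
    errs = [abs(noisy_quadrature(T, sigma, rng)) for _ in range(trials)]
    return float(np.mean(errs))

if __name__ == "__main__":
    sigma = 1.0
    e_small = mean_abs_error(100, sigma)     # T = 100
    e_large = mean_abs_error(10000, sigma)   # T = 10000 (100x more queries)
    # With 100x more queries, the error should shrink by roughly 10x,
    # consistent with the sigma * T^{-1/2} scaling.
    print(e_small, e_large, e_small / e_large)
```

In this toy setting the estimator is exactly unbiased, so the observed error is purely the $\sigma T^{-1/2}$ component; for a general RKHS function the approximation term (here, $T^{-\nu/d-1}$) would also contribute.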