Kernelized Normalizing Constant Estimation: Bridging Bayesian Quadrature and Bayesian Optimization

Abstract

In this paper, we study the problem of estimating the normalizing constant $\int e^{-\lambda f(x)}\,dx$ through queries to the black-box function $f$, where $f$ belongs to a reproducing kernel Hilbert space (RKHS) and $\lambda$ is a problem parameter. We show that to estimate the normalizing constant within a small relative error, the level of difficulty depends on the value of $\lambda$: when $\lambda$ approaches zero, the problem is similar to Bayesian quadrature (BQ), while when $\lambda$ approaches infinity, the problem is similar to Bayesian optimization (BO). More generally, the problem interpolates between BQ and BO as $\lambda$ varies. We find that this pattern holds even when the function evaluations are noisy, bringing new aspects to this topic. Our findings are supported by both algorithm-independent lower bounds and algorithmic upper bounds, as well as simulation studies conducted on a variety of benchmark functions.
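To make the two regimes concrete, the following minimal numerical sketch (not the paper's estimator; the one-dimensional test function `f` and all parameter values are hypothetical) shows why small $\lambda$ reduces to a quadrature problem and large $\lambda$ to an optimization problem: for $\lambda \to 0$, a first-order expansion gives $\int_0^1 e^{-\lambda f(x)}\,dx \approx 1 - \lambda \int_0^1 f(x)\,dx$, so estimating the constant amounts to estimating the integral of $f$; for $\lambda \to \infty$, the integral is dominated by $e^{-\lambda \min_x f(x)}$, so it amounts to finding the minimum of $f$.

```python
# Toy illustration of the two regimes from the abstract (not the paper's
# algorithm): the normalizing constant behaves like an integral of f for
# small lambda and like an optimizer of f for large lambda.
import numpy as np

f = lambda x: (x - 0.3) ** 2          # hypothetical smooth test function on [0, 1]
xs = np.linspace(0.0, 1.0, 10_001)
fx = f(xs)

for lam in [0.01, 1.0, 100.0, 10_000.0]:
    # "Ground truth" normalizing constant via dense trapezoidal quadrature.
    Z = np.trapz(np.exp(-lam * fx), xs)
    # Small-lambda regime: first-order expansion Z ~ 1 - lam * int_0^1 f.
    bq_approx = 1.0 - lam * np.trapz(fx, xs)
    # Large-lambda regime: Z is governed by the minimum of f (Laplace-style).
    bo_scale = np.exp(-lam * fx.min())
    print(f"lam={lam:>8}: Z={Z:.6f}  "
          f"1 - lam*int f = {bq_approx:.6f}  "
          f"e^(-lam min f) = {bo_scale:.6f}")
```

Running this, the first approximation tracks $Z$ closely for small $\lambda$, while for large $\lambda$ the ratio $Z / e^{-\lambda \min f}$ stabilizes, so $-\frac{1}{\lambda}\log Z \to \min_x f(x)$; accurate relative-error estimation thus requires locating the minimizer, matching the BO-like regime described above.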
