Near-Optimal Approximations for Bayesian Inference in Function Space
We propose a scalable inference algorithm for Bayes posteriors defined on a reproducing kernel Hilbert space (RKHS). Given a likelihood function and a Gaussian random element representing the prior, the corresponding Bayes posterior measure $\Pi_B$ can be obtained as the stationary distribution of an RKHS-valued Langevin diffusion. We approximate the infinite-dimensional Langevin diffusion via a projection onto the first $M$ components of the Kosambi-Karhunen-Loève expansion. Exploiting the approximate posterior obtained for these $M$ components, we perform inference for $\Pi_B$ by relying on the law of total probability and a sufficiency assumption. The resulting method scales as $\mathcal{O}(M^3 + J M^2)$, where $J$ is the number of samples produced from the posterior measure $\Pi_B$. Interestingly, the algorithm recovers the posterior arising from the sparse variational Gaussian process (SVGP) (see Titsias, 2009) as a special case, owing to the fact that the sufficiency assumption underlies both methods. However, whereas the SVGP posterior is parametrically constrained to be a Gaussian process, our method is based on a non-parametric variational family consisting of all probability measures on $\mathbb{R}^M$. As a result, our method is provably close to the optimal $M$-dimensional variational approximation of the Bayes posterior $\Pi_B$ in 2-Wasserstein distance for convex and Lipschitz continuous negative log likelihoods, and coincides with the SVGP posterior in the special case of a Gaussian error likelihood.
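To fix ideas, the truncation and the projected dynamics can be sketched in generic notation; the symbols below ($\lambda_m$, $e_m$, $w$, $\eta$, $\ell$) are illustrative choices for a covariance eigensystem, coefficient vector, step size, and negative log likelihood, not necessarily the paper's own notation.

```latex
% Truncated Kosambi-Karhunen-Loève expansion of the Gaussian prior:
% keep the first M eigenpairs (\lambda_m, e_m) of the prior covariance.
f \;=\; \sum_{m=1}^{\infty} \sqrt{\lambda_m}\, w_m e_m
  \;\approx\; \sum_{m=1}^{M} \sqrt{\lambda_m}\, w_m e_m,
  \qquad w_m \overset{\text{iid}}{\sim} \mathcal{N}(0, 1).

% Discretized Langevin dynamics on the coefficients w \in \mathbb{R}^M,
% targeting p(w) \propto \exp\{-\ell(f_w)\}\,\mathcal{N}(w; 0, I_M),
% where f_w denotes the truncated expansion with coefficients w:
w_{j+1} \;=\; w_j \;-\; \eta\,\nabla_w\!\left[\ell(f_{w_j})
  + \tfrac{1}{2}\lVert w_j\rVert^2\right]
  \;+\; \sqrt{2\eta}\,\xi_j, \qquad \xi_j \sim \mathcal{N}(0, I_M).
```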
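A minimal runnable sketch of this projected sampler for Gaussian-noise regression follows, using an eigendecomposition of the kernel matrix as a Nyström-style stand-in for the exact KKL eigensystem. Every name and setting here (rbf_kernel, sigma, step, J) is a hypothetical illustration under these assumptions, not the paper's implementation.

```python
import numpy as np

def rbf_kernel(X, Y, lengthscale=1.0):
    """Squared-exponential kernel matrix between row-stacked inputs."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

rng = np.random.default_rng(0)

# Toy regression data.
N, M = 200, 20                                 # data size, retained components
X = rng.uniform(-3.0, 3.0, size=(N, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(N)

# Truncate to the top-M eigenpairs of the kernel matrix (a stand-in for
# the first M Kosambi-Karhunen-Loève components of the prior).
lam, U = np.linalg.eigh(rbf_kernel(X, X))
lam, U = lam[-M:], U[:, -M:]
Phi = U * np.sqrt(np.maximum(lam, 1e-12))      # f(X) ≈ Phi @ w, w ~ N(0, I_M)

# Precompute sufficient statistics once: O(N M^2). Each Langevin step is
# then O(M^2), giving O(J M^2) overall for J samples.
A, b = Phi.T @ Phi, Phi.T @ y

# Unadjusted Langevin algorithm on w, targeting
# p(w) ∝ exp(-||Phi @ w - y||^2 / (2 sigma^2)) * N(w; 0, I_M).
sigma, step, J = 0.1, 1e-4, 5000
w, ws = np.zeros(M), []
for _ in range(J):
    grad = (A @ w - b) / sigma**2 + w          # grad of negative log posterior
    w = w - step * grad + np.sqrt(2.0 * step) * rng.standard_normal(M)
    ws.append(w)

f_draws = np.stack(ws[J // 2:]) @ Phi.T        # posterior function draws at X
print("posterior mean RMSE:",
      np.sqrt(np.mean((f_draws.mean(0) - np.sin(X[:, 0])) ** 2)))
```

The precomputation of `A` and `b` is what yields the quoted per-sample cost: the data enter each Langevin step only through these two statistics, so the step cost is independent of the data size.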