
Bayesian quadrature for $H^1(\mu)$ with Poincaré inequality on a compact interval

Abstract

Motivated by uncertainty quantification of complex systems, we aim at finding quadrature formulas of the form $\int_a^b f(x)\, d\mu(x) = \sum_{i=1}^n w_i f(x_i)$ where $f$ belongs to $H^1(\mu)$. Here, $\mu$ belongs to a class of continuous probability distributions on $[a, b] \subset \mathbb{R}$ and $\sum_{i=1}^n w_i \delta_{x_i}$ is a discrete probability distribution on $[a, b]$. We show that $H^1(\mu)$ is a reproducing kernel Hilbert space with a continuous kernel $K$, which allows us to reformulate the quadrature question as a Bayesian (or kernel) quadrature problem. Although $K$ does not have a simple closed form in general, we establish a correspondence between its spectral decomposition and the one associated with Poincaré inequalities, whose common eigenfunctions form a T-system (Karlin and Studden, 1966). The quadrature problem can then be solved in the finite-dimensional proxy space spanned by the first eigenfunctions. The solution is given by a generalized Gaussian quadrature, which we call the Poincaré quadrature. We derive several results for the Poincaré quadrature weights and the associated worst-case error. When $\mu$ is the uniform distribution, the results are explicit: the Poincaré quadrature is equivalent to the midpoint (rectangle) quadrature rule, its nodes coincide with the zeros of an eigenfunction, and the worst-case error scales as $\frac{b-a}{2\sqrt{3}}\, n^{-1}$ for large $n$. By comparison with known results for $H^1(0,1)$, this shows that the Poincaré quadrature is asymptotically optimal. For a general $\mu$, we provide an efficient numerical procedure based on finite elements and linear programming. Numerical experiments provide useful insights: nodes are nearly evenly spaced, weights are close to the probability density at the nodes, and the worst-case error is approximately $O(n^{-1})$ for large $n$.
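For the uniform case described above, the abstract states that the Poincaré quadrature reduces to the midpoint (rectangle) rule with the worst-case error scaling as $\frac{b-a}{2\sqrt{3}}\, n^{-1}$. The sketch below (not from the paper; the function `midpoint_quadrature` and the test integrand are illustrative choices) shows this rule written as a discrete probability distribution, i.e. with weights summing to one:

```python
import numpy as np

def midpoint_quadrature(a, b, n):
    """Midpoint rule viewed as a discrete probability distribution:
    n equally weighted nodes at the midpoints of n equal subintervals
    of [a, b]. Per the abstract, this coincides with the Poincare
    quadrature when mu is the uniform distribution on [a, b]."""
    nodes = a + (np.arange(n) + 0.5) * (b - a) / n
    weights = np.full(n, 1.0 / n)  # weights sum to 1 (probability measure)
    return nodes, weights

a, b, n = 0.0, 1.0, 100
x, w = midpoint_quadrature(a, b, n)

# Approximate int_a^b f(x) dmu(x) for mu uniform on [a, b].
approx = np.sum(w * np.sin(x))
exact = 1.0 - np.cos(1.0)  # closed-form integral of sin over [0, 1]

# Asymptotic worst-case error rate stated in the abstract,
# over the unit ball of H^1(mu):
wce_rate = (b - a) / (2 * np.sqrt(3)) / n
```

For a smooth integrand like `sin`, the midpoint rule converges at the faster $O(n^{-2})$ rate; the $O(n^{-1})$ rate above is the worst case over the whole $H^1$ unit ball.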
