Optimal weighted least-squares methods

Abstract

We consider the problem of reconstructing an unknown bounded function $u$ defined on a domain $X\subset\mathbb{R}^d$ from noiseless or noisy samples of $u$ at $n$ points $(x^i)_{i=1,\dots,n}$. We measure the reconstruction error in a norm $L^2(X,d\rho)$ for some given probability measure $d\rho$. Given a linear space $V_m$ with $\dim(V_m)=m\leq n$, we study in general terms the weighted least-squares approximations from the spaces $V_m$ based on independent random samples. The contribution of the present paper is twofold. From the theoretical perspective, we establish results in expectation and in probability for weighted least squares in general approximation spaces $V_m$. These results show that for an optimal choice of sampling measure $d\mu$ and weight $w$, which depends on the space $V_m$ and on the measure $d\rho$, stability and optimal accuracy are achieved under the mild condition that $n$ scales linearly with $m$ up to an additional logarithmic factor. The present analysis also covers cases where the function $u$ and its approximants from $V_m$ are unbounded, which might occur for instance in the relevant case where $X=\mathbb{R}^d$ and $d\rho$ is the Gaussian measure. From the numerical perspective, we propose a sampling method which allows one to generate independent and identically distributed samples from the optimal measure $d\mu$. This method becomes of interest in the multivariate setting, where $d\mu$ is generally not of tensor product type. We illustrate this for particular examples of approximation spaces $V_m$ of polynomial type, where the domain $X$ is allowed to be unbounded and high or even infinite dimensional, motivated by certain applications to parametric and stochastic PDEs.
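The optimal-sampling idea described above can be illustrated with a minimal sketch in the univariate Legendre case, assuming $X=[-1,1]$ with $d\rho$ the uniform measure and $V_m$ spanned by the first $m$ normalized Legendre polynomials $\varphi_j$. The optimal measure is then the mixture $d\mu=\frac1m\sum_{j<m}\varphi_j^2\,d\rho$ and the optimal weight is $w(x)=m/\sum_{j<m}\varphi_j(x)^2$. All function names and the rejection-sampling step below are illustrative choices, not the paper's specific algorithm:

```python
import numpy as np
from numpy.polynomial import legendre

def phi(j, x):
    # L^2(dρ)-normalized Legendre polynomial for dρ = dx/2 on [-1, 1]
    c = np.zeros(j + 1); c[j] = 1.0
    return np.sqrt(2 * j + 1) * legendre.legval(x, c)

def sample_optimal(n, m, rng):
    # i.i.d. draws from dμ = (1/m) Σ_j φ_j² dρ: pick a mixture component j,
    # then rejection-sample from φ_j² dρ (envelope constant sup φ_j² = 2j+1)
    xs = np.empty(n)
    for i in range(n):
        j = rng.integers(m)
        while True:
            x = rng.uniform(-1.0, 1.0)
            if rng.uniform(0.0, 2 * j + 1) < phi(j, x) ** 2:
                xs[i] = x
                break
    return xs

def weighted_lsq(u, n, m, seed=0):
    # weighted least-squares fit of u on V_m from n optimal random samples
    rng = np.random.default_rng(seed)
    x = sample_optimal(n, m, rng)
    km = sum(phi(j, x) ** 2 for j in range(m))   # inverse Christoffel function
    w = m / km                                    # optimal weights w(x^i)
    A = np.column_stack([phi(j, x) for j in range(m)])
    sw = np.sqrt(w)
    coef, *_ = np.linalg.lstsq(sw[:, None] * A, sw * u(x), rcond=None)
    return coef
```

In the noiseless case, a function already belonging to $V_m$ (e.g. $u(x)=x^2$ with $m=3$) is recovered up to machine precision, while for general $u$ the theory above guarantees near-optimal accuracy once $n$ exceeds $m$ by a logarithmic factor.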
