Optimal weighted least-squares methods

We consider the problem of reconstructing an unknown bounded function $u$ defined on a domain $X \subset \mathbb{R}^d$ from noiseless or noisy samples of $u$ at $n$ points $(x^i)_{i=1,\dots,n}$. We measure the reconstruction error in the norm of $L^2(X, d\rho)$ for some given probability measure $d\rho$. Given a linear space $V_m$ with $\dim(V_m) = m \le n$, we study in general terms the weighted least-squares approximations from the spaces $V_m$ based on independent random samples. The contribution of the present paper is twofold. From the theoretical perspective, we establish results in expectation and in probability for weighted least squares in general approximation spaces $V_m$. These results show that, for an optimal choice of sampling measure $d\mu$ and weight $w$, which depends on the space $V_m$ and on the measure $d\rho$, stability and optimal accuracy are achieved under the mild condition that $n$ scales linearly with $m$, up to an additional logarithmic factor. The present analysis also covers cases where the function $u$ and its approximants from $V_m$ are unbounded, which might occur, for instance, in the relevant case where $X = \mathbb{R}^d$ and $\rho$ is the Gaussian measure. From the numerical perspective, we propose a sampling method which allows one to generate independent and identically distributed samples from the optimal measure $d\mu$. This method becomes of interest in the multivariate setting, where $d\mu$ is generally not of tensor product type. We illustrate this for particular examples of approximation spaces $V_m$ of polynomial type, where the domain $X$ is allowed to be unbounded and high or even infinite dimensional, motivated by certain applications to parametric and stochastic PDEs.
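To make the construction concrete, here is a minimal sketch, not the paper's implementation, of optimal weighted least squares on a univariate Legendre polynomial space $V_m$ with $X = [-1,1]$ and $\rho$ the uniform measure. In this line of work the optimal sampling measure is typically taken as $d\mu = (k_m/m)\, d\rho$ with weight $w = m/k_m$, where $k_m(x) = \sum_{j=1}^{m} |L_j(x)|^2$ for an $L^2(X, d\rho)$-orthonormal basis $(L_j)$ of $V_m$; the sketch assumes this choice and draws the samples by plain rejection sampling, while the paper's sampling method and setting may differ. All function names are illustrative.

```python
# Illustrative sketch only: optimal weighted least squares on the span of the
# first m Legendre polynomials, X = [-1, 1], rho = uniform measure dx/2.
# Samples are drawn from d_mu = (k_m/m) d_rho by rejection sampling, and each
# sample x^i carries the weight w(x^i) = m / k_m(x^i).
import numpy as np
from numpy.polynomial import legendre


def orthonormal_basis(x, m):
    """Evaluate the first m L^2(d rho)-orthonormal Legendre polynomials at x."""
    V = legendre.legvander(np.atleast_1d(x), m - 1)   # columns P_0, ..., P_{m-1}
    return V * np.sqrt(2 * np.arange(m) + 1)          # orthonormalize w.r.t. dx/2


def sample_optimal_measure(m, n, rng):
    """Draw n i.i.d. points from d_mu = (k_m/m) d_rho on [-1, 1]."""
    # For Legendre polynomials k_m peaks at x = +/-1 with value m^2, so
    # k_m(x)/m <= m gives a valid rejection bound against the uniform proposal.
    samples = []
    while len(samples) < n:
        x = rng.uniform(-1.0, 1.0)
        k = (orthonormal_basis(x, m) ** 2).sum()      # k_m(x)
        if rng.uniform(0.0, m) <= k / m:
            samples.append(x)
    return np.array(samples)


def weighted_least_squares(u, m, n, rng):
    """Fit u on V_m from n weighted samples drawn from the optimal measure."""
    x = sample_optimal_measure(m, n, rng)
    B = orthonormal_basis(x, m)                       # n x m design matrix
    w = m / (B ** 2).sum(axis=1)                      # weights w(x^i) = m / k_m(x^i)
    sw = np.sqrt(w)
    coeffs, *_ = np.linalg.lstsq(sw[:, None] * B, sw * u(x), rcond=None)
    return coeffs                                     # coefficients in the orthonormal basis


rng = np.random.default_rng(0)
c = weighted_least_squares(np.exp, m=8, n=200, rng=rng)
```

With this choice of weights, the weighted Gram matrix of the design concentrates around the identity, which is the mechanism typically used to establish stability and near-best accuracy when $n$ scales linearly with $m$ up to a logarithmic factor, as stated in the abstract.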