Sequential sampling for optimal weighted least squares approximations in
hierarchical spaces
We consider the problem of approximating an unknown function u from its evaluations at given sampling points x^1, …, x^n ∈ X, where X is a general domain and ρ a probability measure on X. The approximation is picked in a linear space V_m, where m = dim(V_m), and computed by a weighted least squares method. Recent results show the advantages of picking the sampling points at random according to a well-chosen probability measure μ that depends both on V_m and ρ. With such a random design, the weighted least squares approximation is proved to be stable with high probability, and to have accuracy comparable to that of the exact L²(X, ρ)-orthogonal projection onto V_m, in the near-linear sampling regime n ∼ m log m. The present paper is motivated by the adaptive approximation context, in which one typically generates a nested sequence of spaces (V_m)_{m≥1} with increasing dimension. Although the optimal measure μ_m changes with m, it is possible to recycle the previously generated samples by interpreting μ_m as a mixture between μ_{m-1} and an update measure σ_m. Based on this observation, we discuss sequential sampling algorithms that maintain the stability and approximation properties uniformly over all spaces V_m. Our main result is that the total number of samples computed at step m remains of the order m log m with high probability. Numerical experiments confirm this analysis.
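As a concrete illustration of weighted least squares with optimal random sampling, and of the mixture structure that permits sample recycling, here is a minimal Python sketch. It assumes a specific instance not fixed by the abstract: X = [-1, 1], ρ the uniform measure, and V_m spanned by the first m Legendre polynomials. All function names (basis, sample_mu, sample_sigma, wls_fit) are ours, not from the paper.

```python
import numpy as np

# Illustrative sketch, not the paper's code.  Setup (our assumption):
# X = [-1, 1], rho = uniform measure, V_m = span of Legendre polynomials
# of degree < m, which are L2(rho)-orthonormal after rescaling.

def basis(x, m):
    """Evaluate the m L2(rho)-orthonormal Legendre polynomials at points x."""
    V = np.polynomial.legendre.legvander(x, m - 1)
    return V * np.sqrt(2.0 * np.arange(m) + 1.0)

def sample_mu(m, n, rng):
    """Draw n i.i.d. points from d(mu_m) = (k_m / m) d(rho), where
    k_m = sum_j phi_j^2, by rejection from rho (k_m <= m^2 on [-1, 1])."""
    pts = np.empty(0)
    while pts.size < n:
        x = rng.uniform(-1.0, 1.0, 4 * n * m)
        k = (basis(x, m) ** 2).sum(axis=1)
        pts = np.concatenate([pts, x[rng.random(x.size) < k / m**2]])
    return pts[:n]

def sample_sigma(m, n, rng):
    """Draw n points from the update measure d(sigma_m) = phi_{m-1}^2 d(rho)
    (0-based indexing), using phi_{m-1}^2 <= 2m - 1 on [-1, 1]."""
    pts = np.empty(0)
    while pts.size < n:
        x = rng.uniform(-1.0, 1.0, 4 * n * m)
        p = basis(x, m)[:, -1] ** 2
        pts = np.concatenate([pts, x[rng.random(x.size) < p / (2 * m - 1)]])
    return pts[:n]

def wls_fit(u, x, m):
    """Weighted least squares fit in V_m with the weights w = m / k_m."""
    Phi = basis(x, m)
    sw = np.sqrt(m / (Phi ** 2).sum(axis=1))
    coef, *_ = np.linalg.lstsq(sw[:, None] * Phi, sw * u(x), rcond=None)
    return coef

rng = np.random.default_rng(0)
u = np.exp
m = 6
n = int(4 * m * np.log(m))            # sampling budget of order m log m
x = sample_mu(m, n, rng)
c = wls_fit(u, x, m)

# Mixture identity: k_m = k_{m-1} + phi_{m-1}^2 implies
#   mu_m = ((m-1)/m) mu_{m-1} + (1/m) sigma_m,
# so a mu_{m+1}-sample can reuse a stored mu_m-sample with probability
# m/(m+1) and draw fresh from sigma_{m+1} otherwise.
n1 = int(4 * (m + 1) * np.log(m + 1))
reuse = rng.random(n1) < m / (m + 1)
pool = sample_mu(m, n1, rng)          # stands in for previously stored samples
x1 = np.where(reuse, pool, sample_sigma(m + 1, n1, rng))
c1 = wls_fit(u, x1, m + 1)
```

In practice only the non-reused entries would need fresh σ-draws; a full batch is drawn here for brevity. The point of the weights w = m / k_m is that the empirical Gramian (1/n) Σ_i w(x^i) φ(x^i) φ(x^i)^T concentrates around the identity in this sampling regime, which is what makes the fit stable.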