All Papers
This paper is concerned with the approximation of a function u in a given approximation space V_m of dimension m from evaluations of the function at suitably chosen points. The aim is to construct an approximation of u in V_m which yields an error close to the best approximation error in V_m, using as few evaluations as possible. Classical least-squares regression, which defines a projection in V_m from n random points, usually requires a large n to guarantee a stable approximation and an error close to the best approximation error. This is a major drawback for applications where the function is expensive to evaluate. One remedy is to use a weighted least-squares projection based on n samples drawn from a properly selected distribution. In this paper, we introduce a boosted weighted least-squares method which almost surely ensures the stability of the weighted least-squares projection with a sample size close to the interpolation regime n = m. It consists in sampling according to a measure associated with the optimization of a stability criterion over a collection of independent n-samples, and resampling according to this measure until a stability condition is satisfied. A greedy method is then proposed to remove points from the obtained sample. Quasi-optimality properties are obtained for the weighted least-squares projection, with or without the greedy procedure. The proposed method is validated on numerical examples and compared to state-of-the-art interpolation and weighted least-squares methods.
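The resampling idea described above can be sketched in a few lines. The sketch below is a minimal illustration, not the paper's exact algorithm: it assumes an orthonormal Legendre basis on [-1, 1] with the uniform reference measure, draws points from the associated optimal density by rejection sampling, selects the most stable of M candidate n-samples (M and the stability threshold delta are illustrative choices), resamples until the empirical Gram matrix is close to the identity, and then solves the weighted least-squares problem.

```python
import numpy as np

rng = np.random.default_rng(0)

def basis(x, m):
    # Orthonormal Legendre basis w.r.t. the uniform measure on [-1, 1]:
    # phi_j = sqrt(2j + 1) * P_j, so that E[phi_j * phi_k] = delta_jk.
    return np.polynomial.legendre.legvander(x, m - 1) * np.sqrt(2 * np.arange(m) + 1)

def optimal_sample(n, m):
    # Rejection sampling from the optimal density w_m(x) = (1/m) sum_j phi_j(x)^2
    # (relative to the uniform measure); w_m is bounded by m on [-1, 1].
    pts = []
    while len(pts) < n:
        x = rng.uniform(-1.0, 1.0)
        w_m = (basis(np.array([x]), m) ** 2).mean()
        if rng.uniform() <= w_m / m:
            pts.append(x)
    return np.array(pts)

def gram_deviation(x, m):
    # Empirical Gram matrix of the weighted design; its spectral distance
    # to the identity serves as the stability criterion.
    Phi = basis(x, m)
    w = m / (Phi ** 2).sum(axis=1)          # weights 1 / w_m(x_i)
    G = (Phi * w[:, None]).T @ Phi / len(x)
    return np.linalg.norm(G - np.eye(m), 2), w, Phi

def boosted_wls(u, m, n, M=100, delta=0.5):
    # Boosted step: draw M independent n-samples, keep the most stable one,
    # and resample until the stability condition ||G - I|| <= delta holds.
    while True:
        candidates = [optimal_sample(n, m) for _ in range(M)]
        x = min(candidates, key=lambda s: gram_deviation(s, m)[0])
        dev, w, Phi = gram_deviation(x, m)
        if dev <= delta:
            break
    # Weighted least-squares projection onto span{phi_0, ..., phi_{m-1}}.
    sw = np.sqrt(w)
    c, *_ = np.linalg.lstsq(sw[:, None] * Phi, sw * u(x), rcond=None)
    return c

m, n = 5, 15                     # n a small multiple of m, for illustration
c = boosted_wls(np.exp, m, n)    # approximate exp on [-1, 1]
xs = np.linspace(-1.0, 1.0, 201)
err = np.max(np.abs(basis(xs, m) @ c - np.exp(xs)))
```

The greedy point-removal step of the paper is omitted here; the sketch only shows the sampling, stability check, and weighted projection.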