Selection of variables and dimension reduction in high-dimensional non-parametric regression

We consider an $\ell_1$-penalization procedure in the non-parametric Gaussian regression model. In many concrete examples, the dimension $d$ of the input variable $X$ is very large (sometimes growing with the number of observations). Estimation of a $\beta$-regular regression function $f$ cannot be faster than the slow rate $n^{-2\beta/(2\beta+d)}$. Fortunately, in some situations, $f$ depends only on a few of the coordinates of $X$. In this paper, we construct two procedures. The first selects, with high probability, these coordinates. Then, using this subset selection method, we run a local polynomial estimator (on the set of selected coordinates) to estimate the regression function at the rate $n^{-2\beta/(2\beta+d^*)}$, where $d^*$, the "real" dimension of the problem (the exact number of variables on which $f$ depends), replaces the dimension $d$ of the design. To achieve this result, we use an $\ell_1$-penalization method in this non-parametric setup.
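
To make the two-step idea concrete, here is a minimal sketch, not the authors' exact procedure: a plain Lasso stands in for the $\ell_1$-penalized selection step, and a locally weighted linear fit plays the role of the local polynomial estimator restricted to the selected coordinates. All names and parameters below (`select_variables`, `local_linear_estimate`, the bandwidth `h`, the penalty level `alpha`) are illustrative assumptions.

```python
# Illustrative sketch of the two-step procedure described in the abstract
# (an assumption, not the authors' exact method):
#   (1) an l1-penalized fit selects the relevant coordinates,
#   (2) a local polynomial (here, locally weighted linear) estimator is run
#       using only the selected coordinates.
import numpy as np
from sklearn.linear_model import Lasso

def select_variables(X, y, alpha=0.1):
    """Step 1: keep the coordinates with non-zero Lasso coefficients.
    A plain Lasso stands in for the paper's l1-penalized selection step."""
    lasso = Lasso(alpha=alpha).fit(X, y)
    return np.flatnonzero(lasso.coef_)

def local_linear_estimate(X, y, x0, h=0.5):
    """Step 2: degree-1 local polynomial estimate of f(x0) with a Gaussian
    kernel of bandwidth h; the intercept of the weighted fit estimates f(x0)."""
    w = np.sqrt(np.exp(-np.sum((X - x0) ** 2, axis=1) / (2 * h ** 2)))
    design = np.hstack([np.ones((X.shape[0], 1)), X - x0])
    coef, *_ = np.linalg.lstsq(design * w[:, None], y * w, rcond=None)
    return coef[0]

# Toy example: f depends on only d* = 2 of the d = 20 coordinates.
rng = np.random.default_rng(0)
n, d = 500, 20
X = rng.uniform(-1.0, 1.0, size=(n, d))
y = np.sin(3 * X[:, 0]) + 2 * X[:, 1] + 0.1 * rng.standard_normal(n)

selected = select_variables(X, y)                 # ideally recovers {0, 1}
x0 = np.zeros(d)
f_hat = local_linear_estimate(X[:, selected], y, x0[selected])
print("selected coordinates:", selected, "estimate of f(0):", f_hat)
```

Because the local estimator only sees the $d^*$ selected coordinates, its bandwidth and accuracy are governed by $d^*$ rather than $d$, which is the source of the rate improvement described above.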