We consider the problem of recovery of an unknown multivariate signal $f$ observed in a $d$-dimensional Gaussian white noise model of intensity $\varepsilon$. We assume that $f$ belongs to a class $\mathcal{F}_d$ of smooth functions and has an additive sparse structure determined by the parameter $s$, the number of non-zero univariate components contributing to $f$. We are interested in the case when $d \to \infty$ as $\varepsilon \to 0$ and the parameter $s$ stays "small" relative to $d$. With these assumptions, the recovery problem at hand becomes that of determining which sparse additive components of $f$ are non-zero. Attempting to reconstruct most non-zero components of $f$, but not all of them, we arrive at the problem of almost full variable selection in high-dimensional regression. For two different choices of $\mathcal{F}_d$, we establish conditions under which almost full variable selection is possible, and we provide a procedure that achieves it. The procedure performs best (in the asymptotically minimax sense) in selecting most non-zero components of $f$. Moreover, it is adaptive in the parameter $s$.
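For concreteness, here is a sketch of the setup in the standard formulation of the sparse additive Gaussian white noise model; the notation below is assumed, not quoted from the paper:
\[
  dX_\varepsilon(t) \;=\; f(t)\,dt \;+\; \varepsilon\, dW(t),
  \qquad t=(t_1,\dots,t_d)\in[0,1]^d,
\]
where $W$ is a $d$-parameter Brownian sheet and the signal has the additive sparse structure
\[
  f(t) \;=\; \sum_{j=1}^{d} f_j(t_j),
  \qquad \#\{\, j : f_j \not\equiv 0 \,\} \;\le\; s,
\]
with each non-zero univariate component $f_j$ lying in a class of smooth functions. Variable selection then amounts to estimating the sparsity pattern $\eta_j=\mathbf{1}\{f_j\not\equiv 0\}$, $j=1,\dots,d$, by some selector $\hat\eta=(\hat\eta_1,\dots,\hat\eta_d)$; "almost full" selection means, roughly, that the number of misclassified components $\sum_{j=1}^d |\hat\eta_j-\eta_j|$ is of smaller order than $s$.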