Variable selection in high-dimensional additive models based on norms of projections
Abstract
We consider the problem of variable selection in high-dimensional sparse additive models. The proposed method is motivated by geometric considerations in Hilbert spaces and consists of comparing the norms of the projections of the data onto various additive subspaces. Our main results are concentration inequalities which lead to conditions under which variable selection is possible. In special cases these conditions are known to be optimal. As an application we consider the problem of estimating single components. We show that, up to first order, one can estimate a single component as well as if the other components were known.
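To make the projection-norm idea concrete, the following is a minimal sketch (not the paper's procedure): for each covariate, the response is projected onto a small subspace spanned by basis functions of that covariate, and variables whose projection norms are large are retained. The function name, the polynomial basis, and the threshold rule are hypothetical choices for illustration; the paper works with projections onto additive subspaces and calibrates thresholds via concentration inequalities.

```python
import numpy as np

def projection_norm_screen(X, y, n_basis=5, threshold=None):
    """Illustrative screening of additive components by projection norms.

    For each covariate X[:, j], build a small polynomial basis, project y
    onto its column space, and record the squared norm of the projection.
    Variables whose projection norm exceeds the threshold are selected.
    (Hypothetical sketch; the actual method uses additive subspaces and
    thresholds derived from concentration inequalities.)
    """
    n, d = X.shape
    norms = np.empty(d)
    for j in range(d):
        # Centred monomial basis of variable j up to degree n_basis.
        B = np.column_stack([X[:, j] ** k for k in range(1, n_basis + 1)])
        B -= B.mean(axis=0)
        # Orthonormal basis of the column space via reduced QR, then project y.
        Q, _ = np.linalg.qr(B)
        proj = Q @ (Q.T @ y)
        norms[j] = np.sum(proj ** 2)
    if threshold is None:
        # Naive data-driven cutoff, used here only for illustration.
        threshold = np.median(norms) + 2 * norms.std()
    selected = np.where(norms > threshold)[0]
    return selected, norms
```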
