arXiv:1508.06660
Adaptive variable selection in nonparametric sparse additive models

26 August 2015
C. Butucea
N. Stepanova
Abstract

We consider the problem of recovery of an unknown multivariate signal $f$ observed in a $d$-dimensional Gaussian white noise model of intensity $\varepsilon$. We assume that $f$ belongs to a class of smooth functions ${\cal F}^d \subset L_2([0,1]^d)$ and has an additive sparse structure determined by the parameter $s$, the number of non-zero univariate components contributing to $f$. We are interested in the case when $d = d_\varepsilon \to \infty$ as $\varepsilon \to 0$ and the parameter $s$ stays "small" relative to $d$. Under these assumptions, the recovery problem at hand becomes that of determining which sparse additive components are non-zero. By attempting to reconstruct most, but not all, of the non-zero components of $f$, we arrive at the problem of almost full variable selection in high-dimensional regression. For two different choices of ${\cal F}^d$, we establish conditions under which almost full variable selection is possible, and provide a procedure that achieves it. The procedure is optimal (in the asymptotically minimax sense) in selecting most non-zero components of $f$. Moreover, it is adaptive in the parameter $s$.
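For orientation, the setting described in the abstract can be sketched as the standard sparse additive Gaussian white noise model; this is a schematic only (the active set $S$ and the exact function classes ${\cal F}^d$ and rate conditions are specified in the paper itself):

```latex
% Gaussian white noise model of intensity \varepsilon on [0,1]^d:
\[
  dX_\varepsilon(t) = f(t)\,dt + \varepsilon\,dW(t),
  \qquad t \in [0,1]^d,
\]
% with a sparse additive signal supported on an unknown active set S:
\[
  f(t_1,\ldots,t_d) = \sum_{j \in S} f_j(t_j),
  \qquad |S| = s \ll d,
\]
% Variable selection = recovering (most of) S as \varepsilon \to 0
% while the dimension d = d_\varepsilon \to \infty.
```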
