L1-Regularized Least Squares for Support Recovery of High Dimensional Single Index Models with Gaussian Designs

25 November 2015
Matey Neykov
Jun S. Liu
Tianxi Cai
arXiv: 1511.08102
Abstract

It is known that for a certain class of single index models (SIMs) $Y = f(\boldsymbol{X}_{p \times 1}^\intercal \boldsymbol{\beta}_0, \varepsilon)$, support recovery is impossible when $\boldsymbol{X} \sim \mathcal{N}(0, \mathbb{I}_{p \times p})$ and a model complexity adjusted sample size is below a critical threshold. Recently, optimal algorithms based on Sliced Inverse Regression (SIR) were suggested. These algorithms work provably under the assumption that the design $\boldsymbol{X}$ comes from an i.i.d. Gaussian distribution. In the present paper we analyze algorithms based on covariance screening and least squares with $L_1$ penalization (i.e., LASSO) and demonstrate that they can also enjoy optimal (up to a scalar) rescaled sample size in terms of support recovery, albeit under slightly different assumptions on $f$ and $\varepsilon$ compared to the SIR based algorithms. Furthermore, we show more generally that LASSO succeeds in recovering the signed support of $\boldsymbol{\beta}_0$ if $\boldsymbol{X} \sim \mathcal{N}(0, \boldsymbol{\Sigma})$ and the covariance $\boldsymbol{\Sigma}$ satisfies the irrepresentable condition. Our work extends existing results on the support recovery of LASSO for the linear model to a more general class of SIMs.
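To make the main claim concrete, here is a minimal simulation sketch (not the authors' code): $L_1$-penalized least squares fit to data from a nonlinear SIM with a Gaussian design, followed by a check of signed support recovery. The link function $f(u, \varepsilon) = u + u^3 + \varepsilon$, the sparsity level, and the regularization choice are all illustrative assumptions, not values from the paper.

```python
# Sketch: LASSO signed-support recovery for a single index model
# Y = f(X^T beta_0, eps) with X ~ N(0, I_p). All constants are illustrative.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p, s = 600, 1000, 5                       # samples, dimension, sparsity (assumed)

beta0 = np.zeros(p)
beta0[:s] = rng.choice([-1.0, 1.0], size=s)  # signed support on the first s coords

X = rng.standard_normal((n, p))              # Gaussian design, Sigma = I
u = X @ beta0
Y = u + u**3 + rng.standard_normal(n)        # a monotone SIM link (assumption)

# Conservative universal regularization level ~ sd(Y) * sqrt(2 log p / n);
# sklearn's Lasso minimizes (1/2n)||Y - Xw||^2 + alpha * ||w||_1.
alpha = np.std(Y) * np.sqrt(2.0 * np.log(p) / n)
fit = Lasso(alpha=alpha, fit_intercept=False, max_iter=10000).fit(X, Y)

# LASSO estimates beta_0 only up to an unknown positive scalar, so we
# compare supports and signs rather than coefficient values.
est_sign = np.sign(fit.coef_)
print("support recovered:", np.array_equal(est_sign != 0, beta0 != 0))
print("signs recovered  :", np.array_equal(est_sign[:s], np.sign(beta0[:s])))
```

With $\boldsymbol{\Sigma} = \mathbb{I}$ the irrepresentable condition holds trivially; for a general covariance one would draw correlated Gaussian rows instead and verify the standard condition $\|\boldsymbol{\Sigma}_{S^c S}\,\boldsymbol{\Sigma}_{S S}^{-1}\,\mathrm{sign}(\boldsymbol{\beta}_{0,S})\|_\infty < 1$, where $S$ is the support of $\boldsymbol{\beta}_0$.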
