
Sparse recovery with unknown variance: a LASSO-type approach

Abstract

We address the problem of estimating the regression vector $\beta$ and the variance $\sigma^2$ in the generic $s$-sparse linear model $y = X\beta + z$, with $\beta \in \mathbb{R}^p$, $y \in \mathbb{R}^n$, $z \sim \mathcal{N}(0, \sigma^2 I)$ and $p > n$. We propose a new LASSO-type method that jointly estimates $\beta$, $\sigma^2$ and the relaxation parameter $\lambda$ by imposing an explicit trade-off constraint between the log-likelihood and $\ell_1$-penalization terms. We prove that exact recovery of the support and sign pattern of $\beta$ holds with probability at least $1 - O(p^{-\alpha})$. Our assumptions, parametrized by $\alpha$, are similar to those proposed in \cite{CandesPlan:AnnStat09} for known $\sigma^2$. The proof relies on a tail decoupling argument with explicit constants and a recent version of the non-commutative Bernstein inequality \cite{Tropp:ArXiv10}. Our result is then derived from the optimality conditions for the estimators of $\beta$ and $\lambda$. Finally, a thorough analysis of the standard LASSO estimator as a function of $\lambda$ allows us to construct an efficient Newton scheme for the fast computation of our estimators.
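To make the setting concrete, the following is a minimal illustrative sketch of joint estimation of $\beta$ and $\sigma^2$ by alternating a standard LASSO solve (via ISTA) with a residual-based variance update, where the penalty level is rescaled by the current noise estimate. This is not the authors' algorithm: the specific scaling `lam = sigma * sqrt(2*(1+alpha)*log(p))` and the alternating scheme are assumptions chosen only to illustrate the idea of coupling $\lambda$ and $\sigma^2$.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_ista(X, y, lam, n_iter=500):
    # ISTA for min_b 0.5*||y - Xb||^2 + lam*||b||_1, step size 1/L
    p = X.shape[1]
    L = np.linalg.norm(X, 2) ** 2  # Lipschitz constant of the gradient
    beta = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y)
        beta = soft_threshold(beta - grad / L, lam / L)
    return beta

def joint_estimate(X, y, alpha=2.0, n_outer=10):
    # Alternate between a LASSO solve for beta (with a sigma-scaled
    # penalty; the scaling below is a hypothetical illustrative choice)
    # and a residual-based update of the noise variance sigma^2.
    n, p = X.shape
    sigma2 = np.var(y)  # crude initial noise estimate
    for _ in range(n_outer):
        lam = np.sqrt(sigma2) * np.sqrt(2.0 * (1.0 + alpha) * np.log(p))
        beta = lasso_ista(X, y, lam)
        r = y - X @ beta
        sigma2 = (r @ r) / n
    return beta, sigma2
```

On a well-conditioned synthetic problem with a strongly sparse $\beta$, this alternation typically recovers the support while producing a consistent noise-level estimate; the paper's method instead solves the joint problem exactly via a Newton scheme on $\lambda$.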
