
Asymptotic Analysis of LASSO's Solution Path with Implications for Approximate Message Passing

Richard G. Baraniuk
Abstract

This paper concerns the performance of the LASSO (also known as basis pursuit denoising) for recovering sparse signals from undersampled, randomized, noisy measurements. We consider the recovery of the signal $x_o \in \mathbb{R}^N$ from $n$ random and noisy linear observations $y = Ax_o + w$, where $A$ is the measurement matrix and $w$ is the noise. The LASSO estimate is given by the solution of the optimization problem $\hat{x}_{\lambda} = \arg\min_x \frac{1}{2}\|y - Ax\|_2^2 + \lambda\|x\|_1$. Despite major progress in the theoretical analysis of the LASSO solution, little is known about its behavior as a function of the regularization parameter $\lambda$. In this paper we study two questions in the asymptotic setting (i.e., where $N \rightarrow \infty$, $n \rightarrow \infty$ while the ratio $n/N$ converges to a fixed number in $(0,1)$): (i) How does the size of the active set $\|\hat{x}_\lambda\|_0/N$ behave as a function of $\lambda$? (ii) How does the mean square error $\|\hat{x}_{\lambda} - x_o\|_2^2/N$ behave as a function of $\lambda$? We then employ these results in a new, reliable algorithm for solving the LASSO based on approximate message passing (AMP).
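To make the objects in the abstract concrete, the following is a minimal sketch of a generic AMP iteration for the LASSO: soft thresholding of the pseudo-data $x^t + A^T z^t$, with the Onsager correction added to the residual. The threshold rule $\theta_t = \alpha \,\|z^t\|_2/\sqrt{n}$ and the parameter `alpha` are common heuristics assumed here for illustration, not the tuning rule developed in the paper.

```python
import numpy as np

def soft_threshold(v, theta):
    # Component-wise soft thresholding eta(v; theta).
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def amp_lasso(y, A, alpha=2.0, n_iter=30):
    """Generic AMP sketch for the LASSO.

    alpha scales the per-iteration threshold theta_t = alpha * sigma_t,
    where sigma_t is estimated from the residual norm (a standard
    heuristic; the paper's own threshold policy may differ).
    """
    n, N = A.shape
    x = np.zeros(N)
    z = y.copy()
    for _ in range(n_iter):
        sigma = np.linalg.norm(z) / np.sqrt(n)    # noise-level estimate
        v = x + A.T @ z                           # pseudo-data
        x = soft_threshold(v, alpha * sigma)
        # Onsager term: (1/delta) * <eta'> * z = (||x||_0 / n) * z,
        # since eta'(v; theta) is 1 on the active set and 0 elsewhere.
        z = y - A @ x + (np.count_nonzero(x) / n) * z
    return x
```

Here the size of the active set at convergence, `np.count_nonzero(x) / N`, is exactly the quantity $\|\hat{x}_\lambda\|_0/N$ whose dependence on $\lambda$ the paper analyzes.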
