arXiv:1103.1943

Compressed Sensing over $\ell_p$-balls: Minimax Mean Square Error

10 March 2011
D. Donoho, Iain M. Johnstone, A. Maleki, Andrea Montanari
Abstract

We consider the compressed sensing problem, where the object $x_0 \in \mathbb{R}^N$ is to be recovered from incomplete measurements $y = Ax_0 + z$; here the sensing matrix $A$ is an $n \times N$ random matrix with iid Gaussian entries and $n < N$. A popular method of sparsity-promoting reconstruction is $\ell_1$-penalized least-squares reconstruction (aka LASSO, Basis Pursuit). It is currently popular to consider the strict sparsity model, where the object $x_0$ is nonzero in only a small fraction of entries. In this paper, we instead consider the much more broadly applicable $\ell_p$-sparsity model, where $x_0$ is sparse in the sense of having $\ell_p$ norm bounded by $\xi \cdot N^{1/p}$ for some fixed $0 < p \leq 1$ and $\xi > 0$. We study an asymptotic regime in which $n$ and $N$ both tend to infinity with limiting ratio $n/N = \delta \in (0,1)$, both in the noisy ($z \neq 0$) and noiseless ($z = 0$) cases. Under weak assumptions on $x_0$, we are able to precisely evaluate the worst-case asymptotic minimax mean-squared reconstruction error (AMSE) for $\ell_1$-penalized least squares: min over penalization parameters, max over $\ell_p$-sparse objects $x_0$. We exhibit the asymptotically least-favorable object (hardest sparse signal to recover) and the maximin penalization. Our explicit formulas unexpectedly involve quantities appearing classically in statistical decision theory. Occurring in the present setting, they reflect a deeper connection between penalized $\ell_1$ minimization and scalar soft thresholding. This connection, which follows from earlier work of the authors and collaborators on the AMP iterative thresholding algorithm, is carefully explained. Our approach also gives precise results under weak-$\ell_p$ ball coefficient constraints, as we show here.

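The abstract couples three ingredients: the Gaussian measurement model $y = Ax_0 + z$, the $\ell_1$-penalized least-squares (LASSO) estimator, and that estimator's connection to scalar soft thresholding. The sketch below is a rough illustration of that setup, not code from the paper: the dimensions, sparsity level, noise scale, and penalty parameter `lam` are made-up values, and recovery is done with plain ISTA (proximal gradient descent), whose inner step is exactly the soft-thresholding nonlinearity the abstract refers to; the authors' AMP algorithm is a more refined iteration of the same flavor.

```python
# Minimal sketch (assumptions: all sizes and parameters are illustrative).
import numpy as np

rng = np.random.default_rng(0)

N, n = 1000, 500           # signal length N, number of measurements n (n < N)
delta = n / N              # undersampling ratio delta = n / N

# Sparse object x0. Strict sparsity is used here for simplicity; the paper
# studies the broader l_p-ball model with 0 < p <= 1.
x0 = np.zeros(N)
support = rng.choice(N, size=50, replace=False)
x0[support] = rng.normal(size=50)

# Gaussian sensing matrix and noisy measurements y = A x0 + z.
A = rng.normal(size=(n, N)) / np.sqrt(n)
z = 0.01 * rng.normal(size=n)
y = A @ x0 + z

def soft_threshold(v, t):
    """Scalar soft thresholding, the nonlinearity underlying the l1 penalty."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

# ISTA: proximal gradient descent for (1/2)||y - Ax||^2 + lam * ||x||_1.
lam = 0.05
L = np.linalg.norm(A, 2) ** 2   # Lipschitz constant of the quadratic term's gradient
x = np.zeros(N)
for _ in range(500):
    grad = A.T @ (A @ x - y)
    x = soft_threshold(x - grad / L, lam / L)

print("relative MSE:", np.sum((x - x0) ** 2) / np.sum(x0 ** 2))
```

Sweeping `lam` over a grid and taking the worst case over $\ell_p$-bounded signals is, in miniature, the min-max problem whose asymptotic value the paper evaluates exactly.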