Compressed Regression

4 June 2007
Shuheng Zhou
John D. Lafferty
Larry A. Wasserman
Abstract

Recent research has studied the role of sparsity in high dimensional regression and signal reconstruction, establishing theoretical limits for recovering sparse models from sparse data. This line of work shows that ℓ₁-regularized least squares regression can accurately estimate a sparse linear model from n noisy examples in p dimensions, even if p is much larger than n. In this paper we study a variant of this problem where the original n input variables are compressed by a random linear transformation to m ≪ n examples in p dimensions, and establish conditions under which a sparse linear model can be successfully recovered from the compressed data. A primary motivation for this compression procedure is to anonymize the data and preserve privacy by revealing little information about the original data. We characterize the number of random projections that are required for ℓ₁-regularized compressed regression to identify the nonzero coefficients in the true model with probability approaching one, a property called "sparsistence." In addition, we show that ℓ₁-regularized compressed regression asymptotically predicts as well as an oracle linear model, a property called "persistence." Finally, we characterize the privacy properties of the compression procedure in information-theoretic terms, establishing upper bounds on the mutual information between the compressed and uncompressed data that decay to zero.
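
To make the setup concrete, below is a minimal sketch of the compressed regression procedure in Python; it is not code from the paper. The data holder draws a random projection Φ, releases only the compressed pair (ΦX, Φy), and ℓ₁-regularized least squares (lasso) is fit to the compressed data. The dimensions, noise level, and regularization strength are illustrative assumptions chosen only so the example runs quickly.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not values from the paper):
# n examples, p features, s nonzero coefficients, m << n compressed examples.
n, p, s, m = 500, 1000, 5, 100

# Sparse ground-truth model and noisy observations y = X @ beta + noise.
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:s] = 2.0
y = X @ beta + 0.1 * rng.standard_normal(n)

# Random linear compression: only (Phi @ X, Phi @ y) is released,
# revealing little information about individual rows of the original data.
Phi = rng.standard_normal((m, n)) / np.sqrt(m)
X_c, y_c = Phi @ X, Phi @ y

# l1-regularized least squares on the compressed data ("compressed regression").
lasso = Lasso(alpha=0.05, max_iter=10_000)
lasso.fit(X_c, y_c)

print("true support:     ", np.flatnonzero(beta))
print("recovered support:", np.flatnonzero(lasso.coef_))
```

Under suitable conditions on m, the recovered support matches the true nonzero coefficients (the "sparsistence" property studied in the paper); the sketch simply illustrates the pipeline, not those theoretical guarantees.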
