Statistical-Computational Tradeoffs in Mixed Sparse Linear Regression

Gabriel Arpino, R. Venkataramanan
3 March 2023 · arXiv:2303.02118
Abstract

We consider the problem of mixed sparse linear regression with two components, where two real $k$-sparse signals $\beta_1, \beta_2$ are to be recovered from $n$ unlabelled noisy linear measurements. The sparsity is allowed to be sublinear in the dimension, and the additive noise is assumed to be independent Gaussian with variance $\sigma^2$. Prior work has shown that the problem suffers from a $\frac{k}{\mathrm{SNR}^2}$-to-$\frac{k^2}{\mathrm{SNR}^2}$ statistical-to-computational gap, resembling other computationally challenging high-dimensional inference problems such as Sparse PCA and Robust Sparse Mean Estimation; here $\mathrm{SNR}$ is the signal-to-noise ratio. We establish the existence of a more extensive computational barrier for this problem through the method of low-degree polynomials, but show that the problem is computationally hard only in a very narrow symmetric parameter regime. We identify a smooth information-computation tradeoff between the sample complexity $n$ and runtime for any randomized algorithm in this hard regime. Via a simple reduction, this provides novel rigorous evidence for the existence of a computational barrier to solving exact support recovery in sparse phase retrieval with sample complexity $n = \tilde{o}(k^2)$. Our second contribution is to analyze a simple thresholding algorithm which, outside of the narrow regime where the problem is hard, solves the associated mixed regression detection problem in $O(np)$ time using the square root of the number of samples, matching the sample complexity required for (non-mixed) sparse linear regression; this allows the recovery problem to be subsequently solved by state-of-the-art techniques from the dense case. As a special case of our results, we show that this simple algorithm is order-optimal among a large family of algorithms in solving exact signed support recovery in sparse linear regression.
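
The thresholding algorithm itself is not spelled out in the abstract. As a rough illustration of the flavor of such a detector (not the authors' procedure), the sketch below computes per-coordinate correlations between the responses and the columns of the design matrix and counts how many exceed a threshold, running in the $O(np)$ time quoted above. The signal values, the threshold rule, and the synthetic setup are all assumptions chosen for illustration.

```python
import numpy as np

def threshold_detector(X, y, tau):
    """Toy correlation-thresholding detector (illustrative only).

    Computes per-coordinate statistics T_j = (1/n) * sum_i y_i X_ij
    and reports how many coordinates exceed the threshold tau.
    Total work is O(np), matching the runtime quoted in the abstract.
    """
    n, p = X.shape
    T = X.T @ y / n                      # p correlation statistics
    return int(np.sum(np.abs(T) > tau))

# Hypothetical synthetic mixture: each sample is generated from one of
# two k-sparse signals chosen uniformly at random.
rng = np.random.default_rng(0)
n, p, k, sigma = 2000, 500, 10, 0.5
beta1 = np.zeros(p); beta1[rng.choice(p, k, replace=False)] = 1.0
beta2 = np.zeros(p); beta2[rng.choice(p, k, replace=False)] = -1.0
X = rng.standard_normal((n, p))
labels = rng.integers(0, 2, size=n)
y = np.where(labels == 0, X @ beta1, X @ beta2) + sigma * rng.standard_normal(n)

tau = 3.0 * np.sqrt(np.mean(y**2) / n)   # illustrative 3-sigma rule, not from the paper
print("coordinates above threshold:", threshold_detector(X, y, tau))
```

This also hints at why the symmetric regime is the hard one: when $\beta_2 = -\beta_1$ and the mixing is balanced, $\mathbb{E}[y_i x_{ij}] = \frac{1}{2}(\beta_{1,j} + \beta_{2,j}) = 0$, so first-order correlation statistics like the one above carry no signal and more expensive higher-order statistics are needed.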

View on arXiv: https://arxiv.org/abs/2303.02118