arXiv:1502.01988
Computational and Statistical Boundaries for Submatrix Localization in a Large Noisy Matrix

6 February 2015
T. Tony Cai
Tengyuan Liang
Alexander Rakhlin
Abstract

The interplay between computational efficiency and statistical accuracy in high-dimensional inference has drawn increasing attention in the literature. In this paper, we study computational and statistical boundaries for submatrix localization. Given one observation of a signal submatrix (or multiple non-overlapping signal submatrices) of magnitude $\lambda$ and size $k_m \times k_n$, contaminated with a noise matrix of size $m \times n$, we establish two transition thresholds for the signal-to-noise ratio $\lambda/\sigma$ in terms of $m$, $n$, $k_m$, and $k_n$. The first threshold, $\mathsf{SNR}_c$, corresponds to the computational boundary: below this threshold, no polynomial-time algorithm can succeed in identifying the submatrix, under the \textit{hidden clique hypothesis}. We introduce adaptive linear-time spectral algorithms that identify the submatrix with high probability when the signal strength is above the threshold $\mathsf{SNR}_c$. The second threshold, $\mathsf{SNR}_s$, captures the statistical boundary, below which no method can succeed with probability going to one in the minimax sense; the exhaustive search method successfully finds the submatrix above this threshold. The results reveal an interesting phenomenon: $\mathsf{SNR}_c$ is always significantly larger than $\mathsf{SNR}_s$, which implies an essential gap between statistical optimality and computational efficiency for submatrix localization.
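To make the setting concrete, here is a minimal sketch (not the paper's exact algorithm) of a rank-one spectral approach to the localization problem the abstract describes: plant a $k_m \times k_n$ submatrix of mean $\lambda$ in an $m \times n$ Gaussian noise matrix, then estimate the row and column supports from the leading singular vectors. All parameter values below are illustrative assumptions.

```python
import numpy as np

# Illustrative parameters (assumed, not from the paper): the signal
# strength lam is chosen well above the spectral noise level
# sigma * (sqrt(m) + sqrt(n)) so that recovery is easy.
rng = np.random.default_rng(0)
m, n, k_m, k_n = 200, 300, 20, 30
lam, sigma = 3.0, 1.0

rows = rng.choice(m, k_m, replace=False)   # hidden row support
cols = rng.choice(n, k_n, replace=False)   # hidden column support

X = sigma * rng.standard_normal((m, n))
X[np.ix_(rows, cols)] += lam               # plant the signal submatrix

# Spectral step: the planted block contributes a rank-one component of
# singular value roughly lam * sqrt(k_m * k_n), which dominates the top
# singular pair of X when the SNR is large enough.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
u, v = U[:, 0], Vt[0]

# Localize by taking the k_m (resp. k_n) largest-magnitude coordinates.
row_hat = set(np.argsort(-np.abs(u))[:k_m])
col_hat = set(np.argsort(-np.abs(v))[:k_n])

row_overlap = len(row_hat & set(rows)) / k_m
col_overlap = len(col_hat & set(cols)) / k_n
print(row_overlap, col_overlap)
```

With the signal well above the noise operator norm, the top singular vectors concentrate on the planted supports; as the abstract notes, below $\mathsf{SNR}_c$ no polynomial-time procedure (spectral or otherwise) is expected to succeed under the hidden clique hypothesis.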
