Subspace clustering in high-dimensions: Phase transitions & Statistical-to-Computational gap

26 May 2022
Luca Pesce, Bruno Loureiro, Florent Krzakala, Lenka Zdeborová
Abstract

A simple model to study subspace clustering is the high-dimensional $k$-Gaussian mixture model where the cluster means are sparse vectors. Here we provide an exact asymptotic characterization of the statistically optimal reconstruction error in this model in the high-dimensional regime with extensive sparsity, i.e. when the fraction of non-zero components of the cluster means, $\rho$, as well as the ratio $\alpha$ between the number of samples and the dimension, are fixed while the dimension diverges. We identify the information-theoretic threshold below which obtaining a positive correlation with the true cluster means is statistically impossible. Additionally, we investigate the performance of the approximate message passing (AMP) algorithm, analyzed via its state evolution, which is conjectured to be optimal among polynomial-time algorithms for this task. In particular, we identify a statistical-to-computational gap between the algorithmic threshold $\lambda_{\text{alg}} \ge k/\sqrt{\alpha}$, the signal-to-noise ratio the algorithm requires to perform better than random, and the information-theoretic threshold at $\lambda_{\text{it}} \approx \sqrt{-k \rho \log\rho}/\sqrt{\alpha}$. Finally, we discuss the case of sub-extensive sparsity $\rho$ by comparing the performance of AMP with other sparsity-enhancing algorithms, such as sparse PCA and diagonal thresholding.
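
Both thresholds quoted in the abstract are explicit formulas, so the conjectured hard region between them can be evaluated directly. Below is a minimal Python sketch (not from the paper; the function names and the example values of $k$ and $\alpha$ are illustrative assumptions) that computes $\lambda_{\text{alg}}$ and $\lambda_{\text{it}}$ and shows how the gap widens as the sparsity $\rho$ decreases:

```python
import numpy as np

def lambda_alg(k: int, alpha: float) -> float:
    # Algorithmic (AMP) threshold quoted in the abstract: k / sqrt(alpha).
    return k / np.sqrt(alpha)

def lambda_it(k: int, alpha: float, rho: float) -> float:
    # Information-theoretic threshold quoted in the abstract:
    # sqrt(-k * rho * log(rho)) / sqrt(alpha).
    return np.sqrt(-k * rho * np.log(rho)) / np.sqrt(alpha)

if __name__ == "__main__":
    k, alpha = 2, 1.0  # illustrative values, not taken from the paper
    for rho in (0.5, 0.1, 0.01, 0.001):
        it, alg = lambda_it(k, alpha, rho), lambda_alg(k, alpha)
        print(f"rho={rho:6.3f}  lambda_it={it:.3f}  "
              f"lambda_alg={alg:.3f}  ratio={it / alg:.3f}")
```

Since $-\rho\log\rho \to 0$ as $\rho \to 0$, $\lambda_{\text{it}}$ falls well below $\lambda_{\text{alg}}$ at strong sparsity; signal-to-noise ratios between the two thresholds are statistically solvable but conjectured to be hard for polynomial-time algorithms.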

View on arXiv: 2205.13527