Fundamental Limits of Learning High-dimensional Simplices in Noisy Regimes

11 June 2025
Seyed Amir Hossein Saberi
Amir Najafi
Abolfazl Motahari
Babak H. Khalaj
arXiv (abs) | PDF | HTML
Main: 11 pages
Bibliography: 4 pages
Appendix: 29 pages
Abstract

In this paper, we establish sample complexity bounds for learning high-dimensional simplices in $\mathbb{R}^K$ from noisy data. Specifically, we consider $n$ i.i.d. samples uniformly drawn from an unknown simplex in $\mathbb{R}^K$, each corrupted by additive Gaussian noise of unknown variance. We prove that an algorithm exists which, with high probability, outputs a simplex within $\ell_2$ or total variation (TV) distance at most $\varepsilon$ from the true simplex, provided $n \ge (K^2/\varepsilon^2)\, e^{\mathcal{O}(K/\mathrm{SNR}^2)}$, where $\mathrm{SNR}$ is the signal-to-noise ratio. Extending our prior work (Saberi et al., 2023), we derive new information-theoretic lower bounds, showing that simplex estimation within TV distance $\varepsilon$ requires at least $n \ge \Omega(K^3 \sigma^2/\varepsilon^2 + K/\varepsilon)$ samples, where $\sigma^2$ denotes the noise variance. In the noiseless scenario, our lower bound $n \ge \Omega(K/\varepsilon)$ matches known upper bounds up to constant factors. We resolve an open question by demonstrating that when $\mathrm{SNR} \ge \Omega(K^{1/2})$, the noisy-case sample complexity aligns with the noiseless case. Our analysis leverages sample compression techniques (Ashtiani et al., 2018) and introduces a novel Fourier-based method for recovering distributions from noisy observations, potentially applicable beyond simplex learning.
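To make the scaling in these bounds concrete, the short Python sketch below evaluates the stated upper and lower bounds for illustrative values of K, epsilon, and SNR. The constants hidden in the O(.) and Omega(.) notation are not given in the abstract; they are set to 1 here purely for illustration and are not taken from the paper.

import math

def upper_bound_samples(K, eps, snr, c=1.0):
    """Evaluate the stated upper bound n >= (K^2 / eps^2) * exp(O(K / SNR^2)).
    The constant c hidden in the O(.) exponent is unknown; c = 1.0 is an
    arbitrary placeholder used only to illustrate the scaling."""
    return (K**2 / eps**2) * math.exp(c * K / snr**2)

def lower_bound_samples(K, eps, sigma):
    """Evaluate the stated lower bound n >= Omega(K^3 sigma^2 / eps^2 + K / eps),
    again with the hidden constant set to 1 for illustration."""
    return K**3 * sigma**2 / eps**2 + K / eps

# When SNR >= sqrt(K), the exponent K / SNR^2 is at most 1, so the exponential
# factor is bounded by a constant and the noisy upper bound matches the
# noiseless K^2 / eps^2 scaling up to constants -- the regime highlighted above.
K, eps = 50, 0.1
print(upper_bound_samples(K, eps, snr=math.sqrt(K)))   # about e * K^2 / eps^2
print(upper_bound_samples(K, eps, snr=1.0))            # exponentially larger

The abstract also mentions a Fourier-based method for recovering distributions from noisy observations. As a rough, generic illustration of that idea (not the authors' estimator), the one-dimensional sketch below divides the empirical characteristic function of the noisy samples by the Gaussian noise characteristic function to estimate the characteristic function of the clean distribution.

import numpy as np

def deconvolved_cf(samples, t_grid, sigma):
    """Estimate the characteristic function of the clean distribution from
    noisy samples y = x + N(0, sigma^2) by dividing the empirical CF of y
    by the Gaussian CF. A textbook 1-D deconvolution step, shown only to
    illustrate the general idea."""
    # Empirical characteristic function of the noisy samples.
    emp_cf = np.exp(1j * np.outer(t_grid, samples)).mean(axis=1)
    # Characteristic function of the Gaussian noise.
    noise_cf = np.exp(-0.5 * (sigma * t_grid) ** 2)
    # Division amplifies estimation error at large |t|, so practical methods
    # truncate or regularize the frequency range.
    return emp_cf / noise_cf

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=10_000)          # clean samples, uniform on [0, 1]
y = x + rng.normal(0.0, 0.2, size=x.shape)      # noisy observations
t = np.linspace(-10, 10, 201)
cf_est = deconvolved_cf(y, t, sigma=0.2)
# Exact CF of Uniform[0, 1] is (e^{it} - 1) / (it), with value 1 at t = 0.
cf_true = np.where(t == 0, 1.0, (np.exp(1j * t) - 1) / (1j * np.where(t == 0, 1.0, t)))
# The discrepancy grows with |t| because dividing by a small noise CF
# amplifies the sampling error of the empirical CF.
print(np.max(np.abs(cf_est - cf_true)))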

@article{saberi2025_2506.10101,
  title={Fundamental Limits of Learning High-dimensional Simplices in Noisy Regimes},
  author={Seyed Amir Hossein Saberi and Amir Najafi and Abolfazl Motahari and Babak H. Khalaj},
  journal={arXiv preprint arXiv:2506.10101},
  year={2025}
}