Information-Theoretic Guarantees for Recovering Low-Rank Tensors from Symmetric Rank-One Measurements

In this paper, we investigate the sample complexity of recovering tensors with low symmetric rank from symmetric rank-one measurements. This setting is particularly motivated by the study of higher-order interactions and the analysis of two-layer neural networks with polynomial activations (polynomial networks). Using a covering-numbers argument, we analyze the performance of the symmetric rank minimization program and establish near-optimal sample complexity bounds when the underlying measurement distribution is log-concave. Our measurement model involves random symmetric rank-one tensors, which give rise to delicate probability calculations. To address these challenges, we employ the Carbery–Wright inequality, a powerful tool for studying anti-concentration properties of random polynomials, and leverage orthogonal polynomials. Additionally, we provide a sample complexity lower bound based on Fano's inequality, and discuss broader implications of our results for two-layer polynomial networks.
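
As a point of reference, here is a minimal sketch of the setting the abstract describes; the notation below (order $d$, dimension $n$, sample size $m$, measurement vectors $a_i$) is illustrative and may differ from the paper's. Each measurement takes the form
\[
y_i \;=\; \langle T, a_i^{\otimes d} \rangle \;=\; \sum_{j_1, \dots, j_d = 1}^{n} T_{j_1 \cdots j_d}\,(a_i)_{j_1} \cdots (a_i)_{j_d}, \qquad i = 1, \dots, m,
\]
where $T \in (\mathbb{R}^n)^{\otimes d}$ is the unknown symmetric tensor and $a_1, \dots, a_m$ are i.i.d. random vectors, so that each $a_i^{\otimes d}$ is a symmetric rank-one tensor. The symmetric rank minimization program then reads
\[
\widehat{T} \in \operatorname*{arg\,min}_{T'}\; \operatorname{rank}_{\mathrm{sym}}(T') \quad \text{subject to} \quad \langle T', a_i^{\otimes d} \rangle = y_i, \;\; 1 \le i \le m,
\]
where $\operatorname{rank}_{\mathrm{sym}}(T')$ is the smallest $r$ admitting a decomposition $T' = \sum_{k=1}^{r} \lambda_k v_k^{\otimes d}$. Note that if $T = \sum_{k=1}^{r} \lambda_k v_k^{\otimes d}$, then $\langle T, a^{\otimes d} \rangle = \sum_{k=1}^{r} \lambda_k \langle v_k, a \rangle^{d}$, which is exactly the output of a two-layer network with polynomial activation $t \mapsto t^d$; this is the connection to polynomial networks mentioned above.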
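
For context, the two classical tools mentioned can be stated as follows (standard forms, not quoted from the paper). The Carbery–Wright inequality, in one commonly cited formulation, asserts that there is an absolute constant $C > 0$ such that for every log-concave random vector $X$ on $\mathbb{R}^n$ and every polynomial $p$ of degree at most $d$,
\[
\Pr\!\Big( |p(X)| \le \epsilon \, \big(\mathbb{E}\,p(X)^2\big)^{1/2} \Big) \;\le\; C\, d\, \epsilon^{1/d}, \qquad \epsilon > 0,
\]
which controls how much mass a random polynomial, such as $\langle T, a^{\otimes d} \rangle$ above, can place near zero. Fano's inequality, in its packing form, states that if $T$ is drawn uniformly from a finite family $\{T_1, \dots, T_M\}$ and $Y$ denotes the observations, then any estimator $\widehat{T}$ satisfies
\[
\Pr\big( \widehat{T}(Y) \ne T \big) \;\ge\; 1 - \frac{I(T; Y) + \log 2}{\log M},
\]
which is the standard route to a sample complexity lower bound of the kind claimed in the abstract.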
@article{kızıldağ2025_2502.05134,
  title   = {Information-Theoretic Guarantees for Recovering Low-Rank Tensors from Symmetric Rank-One Measurements},
  author  = {Eren C. Kızıldağ},
  journal = {arXiv preprint arXiv:2502.05134},
  year    = {2025}
}