On Learning Parallel Pancakes with Mostly Uniform Weights

We study the complexity of learning $k$-mixtures of Gaussians ($k$-GMMs) on $\mathbb{R}^d$. This task is known to have complexity $d^{\Omega(k)}$ in full generality. To circumvent this exponential lower bound on the number of components, research has focused on learning families of GMMs satisfying additional structural properties. A natural assumption posits that the component weights are not exponentially small and that the components have the same unknown covariance. Recent work gave a $d^{O(\log(1/w_{\min}))}$-time algorithm for this class of GMMs, where $w_{\min}$ is the minimum weight. Our first main result is a Statistical Query (SQ) lower bound showing that this quasi-polynomial upper bound is essentially best possible, even for the special case of uniform weights. Specifically, we show that it is SQ-hard to distinguish between such a mixture and the standard Gaussian. We further explore how the distribution of weights affects the complexity of this task. Our second main result is a quasi-polynomial upper bound for the aforementioned testing task when most of the weights are uniform, while a small fraction of the weights are potentially arbitrary.
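
To make the setting concrete, the sketch below samples from an illustrative "parallel pancakes" style instance: a uniform-weight $k$-GMM whose component means lie along a single hidden direction and whose components share one covariance that is thin along that direction, so the mixture looks exactly like a standard Gaussian in every orthogonal direction. The spacing, thickness, and parameter values here are assumptions chosen for illustration only; this is not the paper's moment-matching hard construction.

```python
import numpy as np

def sample_parallel_pancakes(n, d, k, seed=None):
    """Illustrative sampler (assumed construction, not the paper's):
    a uniform-weight k-GMM with means on a hidden line (direction v)
    and a shared covariance that is thin along v. Orthogonally to v the
    mixture is a standard Gaussian, so distinguishing it from N(0, I_d)
    requires locating v."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(d)
    v /= np.linalg.norm(v)                    # hidden direction
    mean_locs = np.linspace(-1.0, 1.0, k)     # component means along v (illustrative spacing)
    sigma_parallel = 0.1                      # "pancake" thickness along v (assumed value)
    comps = rng.integers(k, size=n)           # uniform mixing weights

    x = rng.standard_normal((n, d))           # standard Gaussian in all of R^d
    along = x @ v
    x -= np.outer(along, v)                   # drop the coordinate along v
    along = sigma_parallel * rng.standard_normal(n) + mean_locs[comps]
    x += np.outer(along, v)                   # replace it with the thin mixture along v
    return x

samples = sample_parallel_pancakes(n=1000, d=50, k=8, seed=0)
print(samples.shape)  # (1000, 50)
```

Projecting these samples onto any direction orthogonal to the hidden $v$ yields an exact standard Gaussian, which is the intuition behind the hardness of the distinguishing task.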
@article{diakonikolas2025_2504.15251,
  title   = {On Learning Parallel Pancakes with Mostly Uniform Weights},
  author  = {Ilias Diakonikolas and Daniel M. Kane and Sushrut Karmalkar and Jasper C.H. Lee and Thanasis Pittas},
  journal = {arXiv preprint arXiv:2504.15251},
  year    = {2025}
}