On Learning Parallel Pancakes with Mostly Uniform Weights

Ilias Diakonikolas
Daniel M. Kane
Sushrut Karmalkar
Jasper C.H. Lee
Thanasis Pittas
Abstract

We study the complexity of learning $k$-mixtures of Gaussians ($k$-GMMs) on $\mathbb{R}^d$. This task is known to have complexity $d^{\Omega(k)}$ in full generality. To circumvent this exponential lower bound on the number of components, research has focused on learning families of GMMs satisfying additional structural properties. A natural assumption posits that the component weights are not exponentially small and that the components have the same unknown covariance. Recent work gave a $d^{O(\log(1/w_{\min}))}$-time algorithm for this class of GMMs, where $w_{\min}$ is the minimum weight. Our first main result is a Statistical Query (SQ) lower bound showing that this quasi-polynomial upper bound is essentially best possible, even for the special case of uniform weights. Specifically, we show that it is SQ-hard to distinguish between such a mixture and the standard Gaussian. We further explore how the distribution of weights affects the complexity of this task. Our second main result is a quasi-polynomial upper bound for the aforementioned testing task when most of the weights are uniform while a small fraction of the weights are potentially arbitrary.
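To make the testing task concrete, the following is a minimal, hypothetical Python sketch (not the paper's moment-matching construction) of a "parallel pancakes"-style instance: a uniform-weight $k$-GMM whose components share one covariance that is thin along a hidden direction and near-identity orthogonal to it, contrasted with the standard Gaussian $N(0, I_d)$. The direction, mean spacing, and thinness parameter below are illustrative assumptions; in the actual hard instances the one-dimensional means and weights are chosen so that low-degree moments match those of $N(0,1)$, which is the source of the SQ hardness.

# Illustrative sketch only: sample from a parallel-pancakes-style k-GMM
# versus the standard Gaussian N(0, I_d). Parameters are assumptions for
# illustration, not the paper's lower-bound construction.
import numpy as np

def sample_parallel_pancakes(n, d, k, sigma_thin=0.05, rng=None):
    """Draw n samples from a uniform-weight k-GMM whose component means lie
    along a random hidden unit direction v, with shared covariance that has
    variance sigma_thin^2 along v and identity orthogonal to v."""
    rng = np.random.default_rng(rng)
    v = rng.standard_normal(d)
    v /= np.linalg.norm(v)                   # hidden direction
    means_1d = np.linspace(-1.0, 1.0, k)     # illustrative mean spacing along v
    labels = rng.integers(k, size=n)         # uniform mixing weights
    # Start from N(0, I_d), then overwrite the component along v with the
    # one-dimensional "pancake" part.
    x = rng.standard_normal((n, d))
    along_v = x @ v
    pancake = means_1d[labels] + sigma_thin * rng.standard_normal(n)
    x += np.outer(pancake - along_v, v)
    return x

def sample_standard_gaussian(n, d, rng=None):
    rng = np.random.default_rng(rng)
    return rng.standard_normal((n, d))

if __name__ == "__main__":
    d, k, n = 50, 8, 10_000
    mix = sample_parallel_pancakes(n, d, k, rng=0)
    ref = sample_standard_gaussian(n, d, rng=1)
    # First moments of both distributions are close; in the genuine hard
    # instances many higher moments also match, defeating SQ algorithms.
    print("mixture mean norm: ", np.linalg.norm(mix.mean(axis=0)))
    print("gaussian mean norm:", np.linalg.norm(ref.mean(axis=0)))

In this toy version the projection onto $v$ reveals the components easily; the difficulty in the paper's setting comes precisely from choosing that one-dimensional structure so that no low-complexity statistic distinguishes it from $N(0,1)$.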

@article{diakonikolas2025_2504.15251,
  title={On Learning Parallel Pancakes with Mostly Uniform Weights},
  author={Ilias Diakonikolas and Daniel M. Kane and Sushrut Karmalkar and Jasper C.H. Lee and Thanasis Pittas},
  journal={arXiv preprint arXiv:2504.15251},
  year={2025}
}