A Fourier Approach to Mixture Learning

We revisit the problem of learning mixtures of spherical Gaussians. Given samples from the mixture $\frac{1}{k}\sum_{j=1}^{k}\mathcal{N}(\mu_j, I_d)$, the goal is to estimate the means $\mu_1, \mu_2, \ldots, \mu_k \in \mathbb{R}^d$ up to a small error. The hardness of this learning problem can be measured by the separation $\Delta$, defined as the minimum distance between all pairs of means. Regev and Vijayaraghavan (2017) showed that with $\Delta = \Omega(\sqrt{\log k})$ separation, the means can be learned using $\mathrm{poly}(k, d)$ samples, whereas super-polynomially many samples are required if $\Delta = o(\sqrt{\log k})$ and $d = \Omega(\log k)$. This leaves open the low-dimensional regime where $d = o(\log k)$.

In this work, we give an algorithm that efficiently learns the means in $d = O(\log k / \log\log k)$ dimensions under separation $d/\sqrt{\log k}$ (modulo doubly logarithmic factors). This separation is strictly smaller than $\sqrt{\log k}$, and is also shown to be necessary. Along with the results of Regev and Vijayaraghavan (2017), our work almost pins down the critical separation threshold at which efficient parameter learning becomes possible for spherical Gaussian mixtures. More generally, our algorithm runs in time $\mathrm{poly}(k) \cdot f(d, \Delta, \epsilon)$, and is thus fixed-parameter tractable in the parameters $d$, $\Delta$, and $\epsilon$.

Our approach is based on estimating the Fourier transform of the mixture at carefully chosen frequencies, and both the algorithm and its analysis are simple and elementary. Our positive results can be easily extended to learning mixtures of non-Gaussian distributions, under a mild condition on the Fourier spectrum of the distribution.
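The estimation primitive behind this approach is easy to illustrate: for a uniform mixture of spherical Gaussians, the characteristic function factors into a Gaussian envelope times a sum of complex exponentials whose frequencies are exactly the unknown means. The following is a minimal Python sketch of that primitive only, not of the paper's full recovery procedure; the component count, the means, the sample size, and the frequency xi are illustrative choices, whereas the paper's contribution lies precisely in how the frequencies are chosen.

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative instance: k = 3 well-separated means in d = 2 dimensions.
    k, d, n = 3, 2, 200_000
    means = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0]])

    # Draw n samples from the uniform mixture (1/k) * sum_j N(mu_j, I_d).
    labels = rng.integers(k, size=n)
    samples = means[labels] + rng.standard_normal((n, d))

    def empirical_cf(x, xi):
        """Empirical characteristic function (1/n) * sum_i exp(i <xi, x_i>)."""
        return np.exp(1j * (x @ xi)).mean()

    # For this mixture the true characteristic function factors as
    #   E[exp(i <xi, X>)] = exp(-||xi||^2 / 2) * (1/k) * sum_j exp(i <xi, mu_j>),
    # so dividing out the Gaussian envelope exposes a sum of k complex
    # exponentials whose frequencies are the unknown means.
    xi = np.array([0.7, -0.3])  # one (arbitrarily chosen) frequency
    envelope = np.exp(-(xi @ xi) / 2.0)
    estimate = empirical_cf(samples, xi) / envelope
    exact = np.exp(1j * (means @ xi)).mean()

    print(f"denoised estimate: {estimate:.4f}")
    print(f"exact value:       {exact:.4f}")

The division by the envelope $e^{-\|\xi\|^2/2}$ is what makes the choice of frequencies delicate: the envelope decays rapidly in $\|\xi\|$, so sampling noise is amplified at high frequencies, while low frequencies cannot resolve nearby means.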