Polynomial Time and Private Learning of Unbounded Gaussian Mixture Models
International Conference on Machine Learning (ICML), 2023
Abstract
We study the problem of privately estimating the parameters of d-dimensional Gaussian Mixture Models (GMMs) with k components. For this, we develop a technique to reduce the problem to its non-private counterpart. This allows us to privatize existing non-private algorithms in a black-box manner, while incurring only a small overhead in sample complexity and running time. As the main application of our framework, we develop an (ε, δ)-differentially private algorithm to learn GMMs using the non-private algorithm of Moitra and Valiant [MV10] as a black box. Consequently, this gives the first sample complexity upper bound and the first polynomial-time algorithm for privately learning GMMs without any boundedness assumptions on the parameters.
