We introduce general tools for designing efficient private estimation algorithms in high-dimensional settings whose statistical guarantees almost match those of the best known non-private algorithms. To illustrate our techniques, we consider two problems: recovery of stochastic block models and learning mixtures of spherical Gaussians. For the former, we present the first efficient $(\varepsilon, \delta)$-differentially private algorithm for both weak recovery and exact recovery. Previously known algorithms achieving comparable guarantees required quasi-polynomial time. For the latter, we design an $(\varepsilon, \delta)$-differentially private algorithm that recovers the centers of the $k$-mixture when the minimum separation is at least $O(k^{1/t}\sqrt{t})$. For all choices of $t$, this algorithm requires sample complexity $n \geq k^{O(1)} d^{O(t)}$ and time complexity $(nd)^{O(t)}$. Prior work required minimum separation at least $\Omega(\sqrt{k})$ as well as an explicit upper bound on the Euclidean norm of the centers.
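To make the mixture-learning task concrete, the sketch below generates a $k$-mixture of spherical Gaussians with well-separated centers and recovers the centers with plain (non-private) Lloyd iterations. All parameter values (`k`, `d`, `n_per`, `sep`) are illustrative choices, not ones from the paper, and the paper's differentially private algorithm and its separation guarantee are not reproduced here; this only shows the estimation problem being solved.

```python
import numpy as np

# Illustrative (non-private) setup: sample points from a k-mixture of
# spherical (unit-variance) Gaussians in R^d whose centers are pairwise
# separated, then estimate the centers. Hypothetical parameters throughout.
rng = np.random.default_rng(0)
k, d, n_per = 3, 10, 500
sep = 8.0  # generous separation so non-private recovery is easy

# Place each center on its own coordinate axis, so any two centers are
# at Euclidean distance sep * sqrt(2) >= sep from each other.
centers = np.zeros((k, d))
for i in range(k):
    centers[i, i] = sep

X = np.vstack([rng.normal(loc=c, scale=1.0, size=(n_per, d)) for c in centers])

# Lloyd's algorithm (k-means), initialized at noisy copies of the true
# centers to keep the sketch simple; real algorithms need careful init.
est = centers + rng.normal(scale=0.5, size=(k, d))
for _ in range(20):
    dists = ((X[:, None, :] - est[None, :, :]) ** 2).sum(-1)  # squared distances
    labels = dists.argmin(1)                                   # nearest center
    est = np.vstack([X[labels == j].mean(0) for j in range(k)])  # cluster means

err = np.abs(est - centers).max()
print(err < 0.5)
```

With this much separation the empirical cluster means concentrate around the true centers (each coordinate of a cluster mean has standard deviation about $1/\sqrt{500}$), so the final error check succeeds; the paper's contribution is achieving comparable recovery under much smaller separation while preserving differential privacy.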