In this paper, we study the problem of estimating the covariance matrix under differential privacy, where the underlying covariance matrix is assumed to be sparse and high dimensional. We propose a new method, called DP-Thresholding, that achieves a non-trivial $\ell_2$-norm based error bound, which is significantly better than the existing ones obtained by adding noise directly to the empirical covariance matrix. We also extend the $\ell_2$-norm based error bound to a general $\ell_w$-norm based one for any $1 \leq w \leq \infty$, and show that they share the same upper bound asymptotically. Our approach can be easily extended to local differential privacy. Experiments on synthetic datasets show results consistent with our theoretical claims.
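To make the general idea concrete, the following is a minimal sketch of a "perturb-then-threshold" covariance estimator: compute the empirical covariance, add symmetric Gaussian noise calibrated to the entrywise sensitivity, and hard-threshold small off-diagonal entries to exploit sparsity. This is an illustration of the generic technique only, not the paper's DP-Thresholding algorithm; the function name `dp_threshold_covariance`, the boundedness assumption on the data, the Gaussian-mechanism noise calibration, and the threshold level are all assumptions introduced here for the example.

```python
import numpy as np

def dp_threshold_covariance(X, epsilon, delta, threshold):
    """Hypothetical sketch: perturb the empirical covariance with Gaussian
    noise, then zero out small off-diagonal entries (hard thresholding).

    X          : (n, p) data matrix, rows assumed bounded by 1 in l_infinity norm.
    epsilon,
    delta      : (epsilon, delta)-differential-privacy parameters.
    threshold  : entrywise hard-thresholding level (assumed tuning parameter).
    """
    n, p = X.shape
    # Empirical covariance (non-private), assuming centered data.
    sigma_hat = (X.T @ X) / n
    # With rows bounded by 1 in l_infinity norm, each entry of X^T X / n changes
    # by at most 2/n when a single row is replaced, so the l_2 sensitivity of
    # the upper triangle is at most (2/n) * sqrt(p(p+1)/2)  (an assumed bound).
    sensitivity = (2.0 / n) * np.sqrt(p * (p + 1) / 2)
    sigma_noise = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    # Draw symmetric Gaussian noise (Gaussian mechanism).
    noise = np.random.normal(scale=sigma_noise, size=(p, p))
    noise = np.triu(noise)
    noise = noise + noise.T - np.diag(np.diag(noise))
    noisy = sigma_hat + noise
    # Hard-threshold small off-diagonal entries; keep the diagonal intact.
    mask = (np.abs(noisy) >= threshold) | np.eye(p, dtype=bool)
    return noisy * mask
```

Under sparsity, thresholding removes most of the noise injected into the (many) near-zero entries, which is the intuition for why such estimators can improve on releasing the noisy covariance matrix directly.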