DP-λλCGD: Efficient Noise Correlation for Differentially Private Model Training

Nikita P. Kalinin
Ryan McKenna
Rasmus Pagh
Christoph H. Lampert
Main: 8 pages · Appendix: 9 pages · Bibliography: 2 pages · 8 figures · 5 tables
Abstract

Differentially private stochastic gradient descent (DP-SGD) is the gold standard for training machine learning models with formal differential privacy guarantees. Several recent extensions improve its accuracy by introducing correlated noise across training iterations. Matrix factorization mechanisms are a prominent example, but they correlate noise across many iterations and require storing previously added noise vectors, leading to substantial memory overhead in some settings. In this work, we propose a new noise correlation strategy that correlates noise only with the immediately preceding iteration and cancels a controlled portion of it. Our method relies on noise regeneration using a pseudorandom noise generator, eliminating the need to store past noise. As a result, it requires no additional memory beyond standard DP-SGD. We show that the computational overhead is minimal and empirically demonstrate improved accuracy over DP-SGD.
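The mechanism described above can be illustrated with a small sketch: each step adds fresh Gaussian noise minus a fraction of the previous step's noise, and the previous noise is regenerated from its PRNG seed rather than kept in memory. This is a toy illustration under assumptions, not the paper's implementation; the cancellation weight `lam`, the helper `regen_noise`, and all other names are hypothetical.

```python
import numpy as np

def regen_noise(seed, dim, sigma):
    # Regenerate the Gaussian noise vector of a given step from its seed,
    # so past noise never needs to be stored (illustrative helper).
    rng = np.random.default_rng(seed)
    return rng.normal(0.0, sigma, size=dim)

def correlated_dp_sgd(grads, lr=0.1, sigma=1.0, lam=0.5, clip=1.0, seed0=0):
    """Toy DP-SGD loop where each step's noise partially cancels the
    previous step's noise. `lam` controls the cancelled fraction; all
    names and defaults here are illustrative assumptions."""
    dim = grads[0].shape[0]
    w = np.zeros(dim)
    prev_seed = None
    for t, g in enumerate(grads):
        # Clip the gradient to bound per-step sensitivity, as in DP-SGD.
        g = g / max(1.0, np.linalg.norm(g) / clip)
        seed = seed0 + t
        noise = regen_noise(seed, dim, sigma)
        if prev_seed is not None:
            # Cancel a lam-fraction of the previous step's noise,
            # regenerated from its seed -- no stored noise vectors.
            noise = noise - lam * regen_noise(prev_seed, dim, sigma)
        w = w - lr * (g + noise)
        prev_seed = seed
    return w
```

Setting `sigma=0` recovers plain (clipped) SGD, which makes the sketch easy to sanity-check; the only extra state beyond standard DP-SGD is a single integer seed.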
