Private Ad Modeling with DP-SGD
Carson E. Denison
Badih Ghazi
Pritish Kamath
Ravi Kumar
Pasin Manurangsi
Krishnagiri Narra
Amer Sinha
A. Varadarajan
Chiyuan Zhang

Abstract
A well-known algorithm in privacy-preserving ML is differentially private stochastic gradient descent (DP-SGD). While this algorithm has been evaluated on text and image data, it has not been previously applied to ads data, which are notorious for their high class imbalance and sparse gradient updates. In this work, we apply DP-SGD to several ad modeling tasks, including predicting click-through rates, conversion rates, and the number of conversion events, and evaluate the privacy-utility trade-off on real-world datasets. Our work is the first to empirically demonstrate that DP-SGD can provide both privacy and utility for ad modeling tasks.
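For readers unfamiliar with DP-SGD, the core mechanism is per-example gradient clipping followed by calibrated Gaussian noise. Below is a minimal NumPy sketch on a toy imbalanced logistic-regression problem; the model, data, and hyperparameters (`clip_norm`, `noise_mult`, learning rate) are illustrative assumptions, not the paper's actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy imbalanced binary data: mostly negatives, loosely mimicking
# click prediction. Purely illustrative, not the paper's data.
n, d = 512, 8
X = rng.normal(size=(n, d))
y = (rng.random(n) < 0.05).astype(float)  # ~5% positive rate

w = np.zeros(d)
clip_norm = 1.0   # per-example gradient clipping bound C (assumed value)
noise_mult = 1.1  # noise multiplier sigma; noise stddev = sigma * C
lr = 0.1
batch_size = 64

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for step in range(100):
    idx = rng.choice(n, size=batch_size, replace=False)
    # Per-example logistic-loss gradients: (p - y) * x, shape (batch, d).
    p = sigmoid(X[idx] @ w)
    grads = (p - y[idx])[:, None] * X[idx]
    # Clip each example's gradient to L2 norm <= clip_norm.
    norms = np.linalg.norm(grads, axis=1, keepdims=True)
    grads = grads / np.maximum(1.0, norms / clip_norm)
    # Sum, add Gaussian noise scaled to the clipping bound, then average.
    noisy = grads.sum(axis=0) + rng.normal(scale=noise_mult * clip_norm, size=d)
    w -= lr * noisy / batch_size

print(np.all(np.isfinite(w)))
```

The clipping bounds each example's influence on the update, which is what lets the Gaussian noise yield a formal (epsilon, delta)-DP guarantee via standard accounting; the sparse, imbalanced gradients the abstract mentions make choosing the clipping bound and noise scale harder in practice.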