Differentially Private Ordinary Least Squares: $t$-Values, Confidence Intervals and Rejecting Null-Hypotheses

Linear regression is one of the most prevalent techniques in data analysis. Given a large collection of samples, each composed of features $x$ and a label $y$, linear regression is used to find the best prediction of the label as a linear combination of the features. However, it is also common to use linear regression for its \emph{explanatory} capabilities rather than for label prediction. Ordinary Least Squares (OLS) is often used in statistics to establish a correlation between an attribute (e.g. gender) and a label (e.g. income) in the presence of other features. Under the assumption of a certain random generative model for the data, OLS derives \emph{$t$-values} --- representing the likelihood of each real value being the true correlation in the underlying distribution. Using $t$-values, OLS can release a \emph{confidence interval} that is likely to contain the true correlation. When this interval does not intersect the origin, we can \emph{reject the null hypothesis}, as it is then likely that $x$ indeed has a non-zero correlation with $y$. Our work aims at achieving similar guarantees on data under differentially private estimators. We use the Gaussian Johnson-Lindenstrauss transform (JLT), which has been shown to satisfy differential privacy when the given data has sufficiently large singular values. We analyze the result of projecting the data using the JLT under the OLS model, derive approximate $t$-values and confidence intervals, and bound the number of samples needed to reject the null hypothesis when the data is drawn i.i.d. from a multivariate Gaussian. When not all singular values of the data are sufficiently large, we increase them, so the projected data yields an approximation of the Ridge regression problem; under certain conditions, we derive confidence intervals in this case as well. We also derive confidence intervals for the "Analyze Gauss" algorithm of Dwork et al.
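To make the project-then-infer idea concrete, here is a minimal Python sketch: stack the data as $[X \mid y]$, multiply by a Gaussian JL matrix, and run classical OLS inference on the projection. Everything here is illustrative, not the paper's estimator: the function name jlt_ols_sketch and all parameter choices are assumptions of ours, and the sketch omits the privacy side entirely (the paper calibrates the projection dimension and a possible singular-value boost to the privacy parameters, and derives corrected approximate $t$-values rather than reusing the classical ones verbatim).

import numpy as np
from scipy import stats

def jlt_ols_sketch(X, y, r, rng=None):
    """Project [X | y] with a Gaussian JL transform, then run OLS on the
    projection. Illustrative only: the privacy calibration of r (and the
    singular-value boost the paper uses when the data is not well spread)
    is omitted."""
    rng = np.random.default_rng(rng)
    n, d = X.shape
    A = np.column_stack([X, y])          # n x (d+1) data matrix [X | y]
    R = rng.standard_normal((r, n))      # r x n Gaussian projection
    RA = R @ A                           # projected data, r x (d+1)
    Xp, yp = RA[:, :d], RA[:, d]
    # OLS on the projected data
    beta, _, _, _ = np.linalg.lstsq(Xp, yp, rcond=None)
    resid = yp - Xp @ beta
    sigma2 = resid @ resid / (r - d)     # residual variance estimate
    cov = sigma2 * np.linalg.inv(Xp.T @ Xp)
    se = np.sqrt(np.diag(cov))
    t = beta / se                        # per-coordinate t-values
    # two-sided 95% confidence intervals from the t distribution
    half = stats.t.ppf(0.975, df=r - d) * se
    return beta, t, np.column_stack([beta - half, beta + half])

# Toy usage: the label has a true non-zero correlation with feature 0,
# so its confidence interval should exclude the origin.
rng = np.random.default_rng(0)
X = rng.standard_normal((5000, 3))
y = 2.0 * X[:, 0] + rng.standard_normal(5000)
beta, t, ci = jlt_ols_sketch(X, y, r=1000, rng=1)
print(beta, ci)

One reason this sketch is in the spirit of the paper: each row of the projected matrix $RA$ is an independent draw from $N(0, A^T A)$, so the projection behaves like a fresh i.i.d. multivariate Gaussian sample, which is exactly the regime in which OLS-style $t$-values and confidence intervals can be analyzed.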