Linear regression is one of the most prevalent techniques in data analysis. Given a collection of samples, each composed of features and a label, linear regression finds the best prediction of the label as a linear combination of the features. However, linear regression is also commonly used for its explanatory power rather than for label prediction. Ordinary Least Squares (OLS) is often used in statistics to establish a correlation between an attribute (e.g. gender) and a label (e.g. income) in the presence of other features. OLS uses linear regression to estimate the correlation between the label and a feature on a given dataset; then, under the assumption of a certain generative model for the data, OLS outputs an interval that is likely to contain the correlation between that feature and the label in the underlying distribution (a confidence interval). When this interval does not intersect the origin, we can reject the null hypothesis, as it is likely that the attribute has a non-zero correlation with the label. Our work aims at achieving similar guarantees on data under differential privacy. We use the Gaussian Johnson-Lindenstrauss (JL) transform, which has been shown to satisfy differential privacy if the data has large singular values. We analyze the result of the JL projection in the OLS model, show how to approximate confidence intervals using only the projected data, and bound the number of samples needed to reject the null hypothesis when the data consists of i.i.d. draws from a multivariate Gaussian. When not all singular values of the data are sufficiently large, we increase the input's singular values and then apply the JL transform. The projected data thus yields an approximation to the Ridge Regression problem, a variant of linear regression that uses an L2-regularization term. We give conditions under which the regularized problem is still helpful in establishing correlations.
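As a rough illustration of the projection step described above, the sketch below (a minimal NumPy example; the helper name jl_project_and_regress and the parameters r and w are illustrative and do not come from the paper) augments the joint data matrix so that every singular value is at least w, applies a Gaussian JL projection, and solves the regression on the projected data alone. Whether the release of the projected matrix actually satisfies differential privacy depends on r, w, and the data's singular values, which this sketch does not verify.

    import numpy as np

    def jl_project_and_regress(X, y, r, w, rng=None):
        """Illustrative sketch: Gaussian JL projection followed by regression.

        X : (n, d) feature matrix, y : (n,) labels.
        r : number of rows of the Gaussian projection matrix.
        w : amount appended to the data (as w * I) so every singular value
            of the joint matrix is at least w.

        Hypothetical helper for exposition only; it does not check the
        conditions under which the projection is differentially private.
        """
        rng = np.random.default_rng() if rng is None else rng
        n, d = X.shape
        A = np.column_stack([X, y])                 # joint matrix [X | y]
        # Append w * I so all singular values of the joint matrix are >= w.
        A_aug = np.vstack([A, w * np.eye(d + 1)])
        # Gaussian JL projection: r x (n + d + 1) matrix of i.i.d. N(0, 1) entries.
        R = rng.standard_normal((r, n + d + 1))
        P = R @ A_aug                               # only this projection is released
        # Solve the regression using the projected data alone.
        Xp, yp = P[:, :d], P[:, d]
        beta_hat, *_ = np.linalg.lstsq(Xp, yp, rcond=None)
        return beta_hat

The rationale, under these assumptions, is that E[Rᵀ R] = r · I, so the normal equations of the projected problem concentrate around r · (Xᵀ X + w² I) β = r · Xᵀ y; the least-squares solution on the projected data therefore approximates the ridge estimate with regularization parameter w².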