Differentially Private Least Squares: Estimation, Confidence and Rejecting the Null Hypothesis

9 July 2015
Or Sheffet
arXiv:1507.02482
Abstract

Linear regression is one of the most prevalent techniques in data analysis. Given a collection of samples composed of features x and a label y, linear regression is used to find the best prediction of the label as a linear combination of the features. However, it is also common to use linear regression for its explanatory capabilities rather than for label prediction. Ordinary Least Squares (OLS) is often used in statistics to establish a correlation between an attribute (e.g. gender) and a label (e.g. income) in the presence of other features. OLS uses linear regression to estimate the correlation between the label and a feature x_j on a given dataset; then, under the assumption of a certain generative model for the data, OLS outputs an interval that is likely to contain the correlation between y and x_j in the underlying distribution (a confidence interval). When this interval does not intersect the origin, we can reject the null hypothesis, as it is likely that x_j has a non-zero correlation with y. Our work aims at achieving similar guarantees on data under differential privacy. We use the Gaussian Johnson-Lindenstrauss transform, which has been shown to satisfy differential privacy if the data has large singular values. We analyze the result of the JL projection in the OLS model and show how to approximate confidence intervals using only the projected data, and we bound the number of samples needed to reject the null hypothesis with i.i.d. draws from a multivariate Gaussian. When not all singular values of the data are sufficiently large, we increase the input's singular values and then apply the JL transform. Thus our projected data yields an approximation for the Ridge Regression problem, a variant of linear regression that uses an l_2-regularization term. We give conditions under which the regularized problem is still helpful in establishing correlations.
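As a rough illustration of the pipeline the abstract describes, here is a minimal Python (NumPy) sketch: project the augmented data A = [X | y] with a Gaussian JL matrix, and solve least squares using only the projection. The function name jl_projected_ols and the parameters r (projection dimension) and w (singular-value floor) are illustrative, not from the paper; calibrating r and w to a concrete (epsilon, delta) privacy guarantee requires the paper's singular-value bounds, which this sketch omits.

```python
import numpy as np

def jl_projected_ols(X, y, r, w=0.0, seed=None):
    """Sketch of least squares via a Gaussian JL projection.

    The projected matrix P = R @ [X | y] is the only quantity the
    analyst needs; per the paper, releasing such a projection is
    differentially private only when the data's singular values are
    large enough.  When w > 0, the rows of w * I are appended first,
    raising every singular value of A to at least w; the recovered
    solution then approximates ridge regression with penalty w**2,
    since A^T A gains a w**2 * I term on its diagonal.
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    A = np.column_stack([X, y])                 # augmented matrix [X | y]
    if w > 0:                                   # enforce a singular-value floor
        A = np.vstack([A, w * np.eye(d + 1)])   # now sigma_min(A) >= w
    R = rng.standard_normal((r, A.shape[0])) / np.sqrt(r)  # Gaussian JL matrix
    P = R @ A                                   # projected data (the release)
    PX, Py = P[:, :d], P[:, d]                  # split back into [PX | Py]
    beta, *_ = np.linalg.lstsq(PX, Py, rcond=None)
    return beta
```

Because P^T P concentrates around A^T A, the normal equations solved on (PX, Py) approximate (X^T X + w^2 I) beta = X^T y, which is exactly the OLS solution when w = 0 and the ridge solution otherwise; approximating the confidence intervals from the projected data is the technical content of the paper and is not reproduced here.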
