
Easy Differentially Private Linear Regression

15 August 2022
Kareem Amin, Matthew Joseph, Mónica Ribero, Sergei Vassilvitskii
Abstract

Linear regression is a fundamental tool for statistical analysis. This has motivated the development of linear regression methods that also satisfy differential privacy and thus guarantee that the learned model reveals little about any one data point used to construct it. However, existing differentially private solutions assume that the end user can easily specify good data bounds and hyperparameters. Both present significant practical obstacles. In this paper, we study an algorithm which uses the exponential mechanism to select a model with high Tukey depth from a collection of non-private regression models. Given $n$ samples of $d$-dimensional data used to train $m$ models, we construct an efficient analogue using an approximate Tukey depth that runs in time $O(d^2 n + dm\log(m))$. We find that this algorithm obtains strong empirical performance in the data-rich setting with no data bounds or hyperparameter selection required.
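
As a rough sketch of the approach the abstract describes (not the paper's exact algorithm), the Python snippet below partitions the data into m chunks, fits ordinary least squares on each chunk, and then uses the exponential mechanism with an approximate Tukey depth as the utility to privately select one model. The coordinate-wise depth, the fixed candidate box, the parameter values, and all function names are illustrative assumptions rather than details taken from the paper.

```python
import numpy as np


def fit_non_private_models(X, y, m, rng):
    """Split the n samples into m disjoint chunks and fit OLS on each chunk.

    Returns an (m, d) array of non-private coefficient vectors.
    """
    n, _ = X.shape
    idx = rng.permutation(n)
    betas = []
    for chunk in np.array_split(idx, m):
        # Ordinary least squares on this chunk; lstsq tolerates rank deficiency.
        beta, *_ = np.linalg.lstsq(X[chunk], y[chunk], rcond=None)
        betas.append(beta)
    return np.stack(betas)


def approx_tukey_depth(point, betas):
    """Coordinate-wise approximate Tukey depth of `point` w.r.t. the m models:
    for each coordinate, take the smaller of the counts of models on either
    side, then take the minimum over coordinates."""
    below = np.sum(betas <= point, axis=0)
    above = np.sum(betas >= point, axis=0)
    return int(np.min(np.minimum(below, above)))


def exponential_mechanism_select(candidates, betas, epsilon, rng):
    """epsilon-DP selection over a data-independent candidate set.

    Utility = approximate Tukey depth, which has sensitivity 1: changing one
    sample changes at most one chunk, hence one model, hence any depth by <= 1.
    """
    utilities = np.array([approx_tukey_depth(c, betas) for c in candidates])
    logits = (epsilon / 2.0) * utilities
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    return candidates[rng.choice(len(candidates), p=probs)]


# Toy usage on synthetic data.
rng = np.random.default_rng(0)
n, d, m = 50_000, 5, 100
X = rng.normal(size=(n, d))
y = X @ np.arange(1.0, d + 1.0) + rng.normal(size=n)

betas = fit_non_private_models(X, y, m, rng)

# Candidates drawn from a fixed, data-independent box; the paper's approach
# notably avoids requiring such hard-coded data bounds.
candidates = rng.uniform(-10.0, 10.0, size=(5_000, d))
private_beta = exponential_mechanism_select(candidates, betas, epsilon=1.0, rng=rng)
print(private_beta)
```

The key design point this sketch tries to convey is the sensitivity argument: because changing a single sample perturbs at most one of the m chunk models, the depth utility changes by at most 1, which is what makes the exponential mechanism applicable without any data-dependent scaling.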
