
arXiv:2111.12786
Differentially Private Nonparametric Regression Under a Growth Condition

24 November 2021
Noah Golowich
Abstract

Given a real-valued hypothesis class $\mathcal{H}$, we investigate under what conditions there is a differentially private algorithm which learns an optimal hypothesis from $\mathcal{H}$ given i.i.d. data. Inspired by recent results for the related setting of binary classification (Alon et al., 2019; Bun et al., 2020), where it was shown that online learnability of a binary class is necessary and sufficient for its private learnability, Jung et al. (2020) showed that in the setting of regression, online learnability of $\mathcal{H}$ is necessary for private learnability. Here online learnability of $\mathcal{H}$ is characterized by the finiteness of its $\eta$-sequential fat shattering dimension, ${\rm sfat}_\eta(\mathcal{H})$, for all $\eta > 0$. In terms of sufficient conditions for private learnability, Jung et al. (2020) showed that $\mathcal{H}$ is privately learnable if $\lim_{\eta \downarrow 0} {\rm sfat}_\eta(\mathcal{H})$ is finite, which is a fairly restrictive condition. We show that under the relaxed condition $\liminf_{\eta \downarrow 0} \eta \cdot {\rm sfat}_\eta(\mathcal{H}) = 0$, $\mathcal{H}$ is privately learnable, establishing the first nonparametric private learnability guarantee for classes $\mathcal{H}$ with ${\rm sfat}_\eta(\mathcal{H})$ diverging as $\eta \downarrow 0$. Our techniques involve a novel filtering procedure to output stable hypotheses for nonparametric function classes.
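To make the gap between the two sufficient conditions concrete, the following sketch contrasts them on a hypothetical growth rate (the logarithmic rate below is an illustrative assumption, not a class analyzed in the paper):

```latex
% Prior sufficient condition (Jung et al., 2020):
\[
\lim_{\eta \downarrow 0} {\rm sfat}_\eta(\mathcal{H}) < \infty
\quad\Longrightarrow\quad \mathcal{H} \text{ is privately learnable.}
\]
% Relaxed growth condition of this paper:
\[
\liminf_{\eta \downarrow 0} \eta \cdot {\rm sfat}_\eta(\mathcal{H}) = 0
\quad\Longrightarrow\quad \mathcal{H} \text{ is privately learnable.}
\]
% Illustrative (hypothetical) growth rate separating the two:
\[
{\rm sfat}_\eta(\mathcal{H}) = \Theta\!\left(\log \tfrac{1}{\eta}\right)
\;\Rightarrow\;
\lim_{\eta \downarrow 0} {\rm sfat}_\eta(\mathcal{H}) = \infty
\quad\text{yet}\quad
\lim_{\eta \downarrow 0} \eta \log \tfrac{1}{\eta} = 0,
\]
% so such a class fails the prior condition but satisfies the relaxed one.
```

Any rate for which $\eta \cdot {\rm sfat}_\eta(\mathcal{H})$ has a vanishing lim inf (e.g., polylogarithmic divergence) falls under the new guarantee, while only bounded ${\rm sfat}_\eta(\mathcal{H})$ satisfied the earlier one.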
