ResearchTrend.AI

arXiv:2303.00738
What Are the Chances? Explaining the Epsilon Parameter in Differential Privacy

1 March 2023
Priyanka Nanayakkara
Mary Anne Smart
Rachel Cummings
Gabriel Kaptchuk
Elissa M. Redmiles
Abstract

Differential privacy (DP) is a mathematical privacy notion increasingly deployed across government and industry. With DP, privacy protections are probabilistic: they are bounded by the privacy budget parameter, ε. Prior work in health and computational science finds that people struggle to reason about probabilistic risks. Yet, communicating the implications of ε to people contributing their data is vital to avoiding privacy theater (presenting meaningless privacy protection as meaningful) and empowering more informed data-sharing decisions. Drawing on best practices in risk communication and usability, we develop three methods to convey probabilistic DP guarantees to end users: two that communicate odds and one offering concrete examples of DP outputs. We quantitatively evaluate these explanation methods in a vignette survey study (n=963) via three metrics: objective risk comprehension, subjective privacy understanding of DP guarantees, and self-efficacy. We find that odds-based explanation methods are more effective than (1) output-based methods and (2) state-of-the-art approaches that gloss over information about ε. Further, when offered information about ε, respondents are more willing to share their data than when presented with a state-of-the-art DP explanation; this willingness to share is sensitive to ε values: as privacy protections weaken, respondents are less likely to share data.
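The abstract describes odds-based explanations of ε without reproducing the paper's exact phrasing or mechanism. As a hedged illustration only, the sketch below shows the standard way ε bounds odds in the simplest DP mechanism, binary randomized response, where the probability of reporting one's true answer is e^ε / (e^ε + 1); this is a textbook example, not necessarily the explanation format the authors tested.

```python
import math

def truth_probability(epsilon: float) -> float:
    """For binary randomized response, the mechanism reports the
    participant's true value with probability e^eps / (e^eps + 1),
    so the odds of truth vs. lie are bounded by e^eps."""
    return math.exp(epsilon) / (math.exp(epsilon) + 1)

# Illustrative epsilon values (chosen for this sketch, not from the paper):
for eps in (0.1, 1.0, 5.0):
    p = truth_probability(eps)
    print(f"eps={eps}: odds bound e^eps = {math.exp(eps):.2f}, "
          f"truth probability = {p:.3f}")
```

This makes the abstract's point concrete: at ε = 0.1 the report is nearly a coin flip (strong privacy), while at ε = 5 the true answer leaks almost every time, which is the weakening of protection that respondents in the study reacted to.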
