Not all noise is accounted equally: How differentially private learning benefits from large sampling rates

12 October 2021
Friedrich Dörmann, Osvald Frisk, L. Andersen, Christian Fischer Pedersen
FedML

Papers citing "Not all noise is accounted equally: How differentially private learning benefits from large sampling rates"

3 / 3 papers shown
Individual Privacy Accounting via a Rényi Filter
Vitaly Feldman, Tijana Zrnic
25 Aug 2020

Tempered Sigmoid Activations for Deep Learning with Differential Privacy
Nicolas Papernot, Abhradeep Thakurta, Shuang Song, Steve Chien, Ulfar Erlingsson
AAML
28 Jul 2020

On Large-Batch Training for Deep Learning: Generalization Gap and Sharp Minima
N. Keskar, Dheevatsa Mudigere, J. Nocedal, M. Smelyanskiy, P. T. P. Tang
ODL
15 Sep 2016