A(DP)$^2$SGD: Asynchronous Decentralized Parallel Stochastic Gradient Descent with Differential Privacy

21 August 2020
Jie Xu, Wei Zhang, Fei Wang
Tags: FedML
arXiv: 2008.09246
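For orientation, the sketch below shows the generic DP-SGD update that the paper's title refers to: per-example gradient clipping followed by Gaussian noise calibrated to the clipping bound. It is a minimal, illustrative sketch only; the function name and all hyperparameter values are assumptions, and it implements the standard centralized DP-SGD step, not the asynchronous decentralized A(DP)$^2$SGD variant the paper itself proposes.

```python
# Illustrative DP-SGD step: per-example gradient clipping + Gaussian noise.
# Assumption-laden sketch -- the generic centralized update, NOT the
# asynchronous decentralized A(DP)^2SGD algorithm from the paper.
import numpy as np

def dp_sgd_step(w, per_example_grads, lr=0.1, clip_norm=1.0,
                noise_multiplier=1.1, rng=None):
    """One DP-SGD update on parameters w given per-example gradients."""
    rng = rng or np.random.default_rng(0)
    clipped = []
    for g in per_example_grads:
        # Scale each example's gradient so its L2 norm is at most clip_norm.
        norm = np.linalg.norm(g)
        clipped.append(g * min(1.0, clip_norm / (norm + 1e-12)))
    grad_sum = np.sum(clipped, axis=0)
    # Gaussian noise calibrated to the clipping bound (the L2 sensitivity).
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=w.shape)
    # Average the noised sum and take a plain SGD step.
    return w - lr * (grad_sum + noise) / len(per_example_grads)

# Toy usage: 8 examples, 3-dimensional parameter vector.
rng = np.random.default_rng(42)
w = np.zeros(3)
grads = rng.normal(size=(8, 3))
w = dp_sgd_step(w, grads, rng=rng)
print(w)
```

The clipping bound caps each example's influence on the update, which is what lets the Gaussian noise level be translated into a formal privacy guarantee.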

Papers citing "A(DP)$^2$SGD: Asynchronous Decentralized Parallel Stochastic Gradient Descent with Differential Privacy"

4 citing papers:

An Efficient DP-SGD Mechanism for Large Scale NLP Models
Christophe Dupuy, Radhika Arava, Rahul Gupta, Anna Rumshisky
Tags: SyDa · 14 Jul 2021

Hogwild! over Distributed Local Data Sets with Linearly Increasing Mini-Batch Sizes
Marten van Dijk, Nhuong V. Nguyen, Toan N. Nguyen, Lam M. Nguyen, Quoc Tran-Dinh, Phuong Ha Nguyen
Tags: FedML · 27 Oct 2020

LEASGD: an Efficient and Privacy-Preserving Decentralized Algorithm for Distributed Learning
Hsin-Pai Cheng, P. Yu, Haojing Hu, Feng Yan, Shiyu Li, Hai Helen Li, Yiran Chen
Tags: FedML · 27 Nov 2018

Aggregated Residual Transformations for Deep Neural Networks
Saining Xie, Ross B. Girshick, Piotr Dollár, Zhuowen Tu, Kaiming He
16 Nov 2016