Evading Curse of Dimensionality in Unconstrained Private GLMs via Private Gradient Descent
arXiv:2006.06783
11 June 2020
Shuang Song, Thomas Steinke, Om Thakkar, Abhradeep Thakurta
Papers citing "Evading Curse of Dimensionality in Unconstrained Private GLMs via Private Gradient Descent" (8 papers shown)
DC-SGD: Differentially Private SGD with Dynamic Clipping through Gradient Norm Distribution Estimation
Chengkun Wei, Weixian Li, Chen Gong, Wenzhi Chen
29 Mar 2025
PILLAR: How to make semi-private learning more effective
Francesco Pinto, Yaxian Hu, Fanny Yang, Amartya Sanyal
06 Jun 2023
Near Optimal Private and Robust Linear Regression
Xiyang Liu, Prateek Jain, Weihao Kong, Sewoong Oh, A. Suggala
30 Jan 2023
Differentially Private Image Classification from Features
Harsh Mehta, Walid Krichene, Abhradeep Thakurta, Alexey Kurakin, Ashok Cutkosky
24 Nov 2022
When Does Differentially Private Learning Not Suffer in High Dimensions?
Xuechen Li, Daogao Liu, Tatsunori Hashimoto, Huseyin A. Inan, Janardhan Kulkarni, Y. Lee, Abhradeep Thakurta
01 Jul 2022
Toward Training at ImageNet Scale with Differential Privacy
Alexey Kurakin, Shuang Song, Steve Chien, Roxana Geambasu, Andreas Terzis, Abhradeep Thakurta
28 Jan 2022
Understanding Clipping for Federated Learning: Convergence and Client-Level Differential Privacy
Xinwei Zhang, Xiangyi Chen, Min-Fong Hong, Zhiwei Steven Wu, Jinfeng Yi
25 Jun 2021
Stochastic Gradient Descent for Non-smooth Optimization: Convergence Results and Optimal Averaging Schemes
Ohad Shamir, Tong Zhang
08 Dec 2012