arXiv:2103.01396
DeepReDuce: ReLU Reduction for Fast Private Inference
International Conference on Machine Learning (ICML), 2021
2 March 2021
N. Jha, Zahra Ghodsi, S. Garg, Brandon Reagen
Papers citing "DeepReDuce: ReLU Reduction for Fast Private Inference" (4 of 54 papers shown)
Sisyphus: A Cautionary Tale of Using Low-Degree Polynomial Activations in Privacy-Preserving Deep Learning
Karthik Garimella, N. Jha, Brandon Reagen
26 Jul 2021
Sphynx: ReLU-Efficient Network Design for Private Inference
IEEE Security and Privacy (IEEE S&P), 2021
Minsu Cho, Zahra Ghodsi, Brandon Reagen, S. Garg, Chinmay Hegde
17 Jun 2021
Layer Folding: Neural Network Depth Reduction using Activation Linearization
Amir Ben Dror, Niv Zehngut, Avraham Raviv, E. Artyomov, Ran Vitek, R. Jevnisek
17 Jun 2021
Circa: Stochastic ReLUs for Private Deep Learning
Zahra Ghodsi, N. Jha, Brandon Reagen, S. Garg
15 Jun 2021