DeepReDuce: ReLU Reduction for Fast Private Inference

International Conference on Machine Learning (ICML), 2021
2 March 2021
N. Jha, Zahra Ghodsi, S. Garg, Brandon Reagen

Papers citing "DeepReDuce: ReLU Reduction for Fast Private Inference"

Showing 4 of 54 citing papers.
Sisyphus: A Cautionary Tale of Using Low-Degree Polynomial Activations in Privacy-Preserving Deep Learning
Karthik Garimella, N. Jha, Brandon Reagen
26 Jul 2021
Sphynx: ReLU-Efficient Network Design for Private Inference
IEEE Security and Privacy (IEEE S&P), 2021
Minsu Cho, Zahra Ghodsi, Brandon Reagen, S. Garg, Chinmay Hegde
17 Jun 2021
Layer Folding: Neural Network Depth Reduction using Activation Linearization
Amir Ben Dror, Niv Zehngut, Avraham Raviv, E. Artyomov, Ran Vitek, R. Jevnisek
17 Jun 2021
Circa: Stochastic ReLUs for Private Deep Learning
Zahra Ghodsi, N. Jha, Brandon Reagen, S. Garg
15 Jun 2021