Private Model Compression via Knowledge Distillation
13 November 2018
Ji Wang, Weidong Bao, Lichao Sun, Xiaomin Zhu, Bokai Cao, Philip S. Yu
Topic: FedML

Papers citing "Private Model Compression via Knowledge Distillation"

Showing 5 of 55 citing papers (page 2 of 2).
Membership Privacy for Machine Learning Models Through Knowledge Transfer
Virat Shejwalkar, Amir Houmansadr
15 Jun 2019
Divide and Conquer: Leveraging Intermediate Feature Representations for Quantized Training of Neural Networks
International Conference on Machine Learning (ICML), 2019
Ahmed T. Elthakeb, Prannoy Pilligundla, Alex Cloninger, H. Esmaeilzadeh
Topic: MQ
14 Jun 2019
Private Deep Learning with Teacher Ensembles
Lichao Sun, Yingbo Zhou, Ji Wang, Jia Li, R. Socher, Philip S. Yu, Caiming Xiong
Topic: FedML
05 Jun 2019
Copying Machine Learning Classifiers
IEEE Access, 2019
Irene Unceta, Jordi Nin, O. Pujol
05 Mar 2019
Improved Knowledge Distillation via Teacher Assistant
AAAI Conference on Artificial Intelligence (AAAI), 2019
Seyed Iman Mirzadeh, Mehrdad Farajtabar, Ang Li, Nir Levine, Akihiro Matsukawa, H. Ghasemzadeh
09 Feb 2019