

Effectiveness of Arbitrary Transfer Sets for Data-free Knowledge Distillation

18 November 2020
Gaurav Kumar Nayak, Konda Reddy Mopuri, Anirban Chakraborty
arXiv:2011.09113

Papers citing "Effectiveness of Arbitrary Transfer Sets for Data-free Knowledge Distillation"

4 papers shown.

1. Scalable Collaborative Learning via Representation Sharing
   Frédéric Berdoz, Abhishek Singh, Martin Jaggi, Ramesh Raskar · FedML · 20 Nov 2022
2. Momentum Adversarial Distillation: Handling Large Distribution Shifts in Data-Free Knowledge Distillation
   Kien Do, Hung Le, D. Nguyen, Dang Nguyen, Haripriya Harikumar, T. Tran, Santu Rana, Svetha Venkatesh · 21 Sep 2022
3. Preventing Catastrophic Forgetting and Distribution Mismatch in Knowledge Distillation via Synthetic Data
   Kuluhan Binici, N. Pham, T. Mitra, K. Leman · 11 Aug 2021
4. Knowledge Distillation: A Survey
   Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao · VLM · 09 Jun 2020