CAE-DFKD: Bridging the Transferability Gap in Data-Free Knowledge Distillation

30 April 2025
Zherui Zhang
Changwei Wang
Rongtao Xu
Wenhao Xu
Shibiao Xu
Yu Zhang
Li Guo
Abstract

Data-Free Knowledge Distillation (DFKD) enables knowledge transfer from a given pre-trained teacher network to a target student model without access to the real training data. Existing DFKD methods focus primarily on improving image recognition performance on the associated datasets, often neglecting the crucial aspect of the transferability of the learned representations. In this paper, we propose Category-Aware Embedding Data-Free Knowledge Distillation (CAE-DFKD), which addresses, at the embedding level, the limitations of previous methods that rely on image-level techniques to improve model generalization but fail when applied directly to DFKD. The superiority and flexibility of CAE-DFKD are extensively evaluated, including: i) significant efficiency advantages resulting from altering the generator training paradigm; ii) competitive performance with existing state-of-the-art DFKD methods on image recognition tasks; iii) remarkable transferability of the data-free learned representations, demonstrated on downstream tasks.
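
For readers unfamiliar with the DFKD setup the abstract refers to, below is a minimal sketch of a generic adversarial data-free distillation loop in PyTorch. This is not the paper's CAE-DFKD method; it illustrates only the baseline setting: a generator synthesizes inputs on which the student is trained to match the teacher, with no real data involved. All function names, shapes, and hyperparameters here are illustrative assumptions.

# Hedged sketch of a generic data-free KD step (NOT the paper's
# CAE-DFKD method). Assumes image-classifier teacher/student and a
# noise-conditioned generator; all hyperparameters are illustrative.
import torch
import torch.nn.functional as F

def dfkd_step(generator, teacher, student, g_opt, s_opt,
              batch_size=64, z_dim=128, temperature=4.0):
    teacher.eval()

    def kd_loss(s_logits, t_logits):
        # Standard temperature-scaled KL distillation loss.
        return F.kl_div(F.log_softmax(s_logits / temperature, dim=1),
                        F.softmax(t_logits / temperature, dim=1),
                        reduction="batchmean") * temperature ** 2

    # 1) Generator update: synthesize inputs on which the student
    #    disagrees with the teacher (maximize the KD loss).
    z = torch.randn(batch_size, z_dim)
    fake = generator(z)
    loss_g = kd_loss(student(fake), teacher(fake).detach())
    g_opt.zero_grad()
    (-loss_g).backward()  # ascend: generator seeks hard samples
    g_opt.step()

    # 2) Student update: minimize the same KD loss on fresh samples.
    with torch.no_grad():
        fake = generator(torch.randn(batch_size, z_dim))
        t_logits = teacher(fake)
    loss_s = kd_loss(student(fake), t_logits)
    s_opt.zero_grad()
    loss_s.backward()
    s_opt.step()
    return loss_s.item()

In this baseline, supervision comes entirely from the teacher's logits on synthetic inputs; the paper's contribution, per the abstract, is to replace such image-level objectives with a category-aware embedding-level objective and a modified generator training paradigm.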

@article{zhang2025_2504.21478,
  title={CAE-DFKD: Bridging the Transferability Gap in Data-Free Knowledge Distillation},
  author={Zherui Zhang and Changwei Wang and Rongtao Xu and Wenhao Xu and Shibiao Xu and Yu Zhang and Li Guo},
  journal={arXiv preprint arXiv:2504.21478},
  year={2025}
}