Contrastive Model Inversion for Data-Free Knowledge Distillation

18 May 2021
Gongfan Fang
Mingli Song
Xinchao Wang
Chen Shen
Xingen Wang
Xiuming Zhang
ArXiv (abs) · PDF · HTML · GitHub (72★)

Papers citing "Contrastive Model Inversion for Data-Free Knowledge Distillation"

49 papers shown
Simple Semi-supervised Knowledge Distillation from Vision-Language Models via Dual-Head Optimization
Seongjae Kang
Dong Bok Lee
Hyungjoon Jang
Sung Ju Hwang
VLM
112
1
0
12 May 2025
CAE-DFKD: Bridging the Transferability Gap in Data-Free Knowledge Distillation
Zherui Zhang
Changwei Wang
Rongtao Xu
Wenyuan Xu
Shibiao Xu
Yu Zhang
Li Guo
108
1
0
30 Apr 2025
Adversarial Curriculum Graph-Free Knowledge Distillation for Graph Neural Networks
Yuang Jia
Xiaojuan Shan
Jun Xia
Guancheng Wan
Y. Zhang
Wenke Huang
Mang Ye
Stan Z. Li
124
0
0
01 Apr 2025
Data-free Knowledge Distillation with Diffusion Models
Xiaohua Qi
Renda Li
Long Peng
Q. Ling
Jun Yu
Ziyi Chen
Peng Chang
Mei Han
Jing Xiao
103
0
0
01 Apr 2025
Toward Efficient Data-Free Unlearning
Chenhao Zhang
Shaofei Shen
Weitong Chen
Miao Xu
MU
158
0
0
18 Dec 2024
Hybrid Data-Free Knowledge Distillation
Jialiang Tang
Shuo Chen
Chen Gong
DD
104
0
0
18 Dec 2024
Relation-Guided Adversarial Learning for Data-free Knowledge Transfer
Yingping Liang
Ying Fu
105
1
0
16 Dec 2024
Large-Scale Data-Free Knowledge Distillation for ImageNet via Multi-Resolution Data Generation
Minh-Tuan Tran
Trung Le
Xuan-May Le
Jianfei Cai
Mehrtash Harandi
Dinh Q. Phung
147
2
0
26 Nov 2024
Efficient and Effective Model Extraction
Hongyu Zhu
Wentao Hu
Sichu Liang
Fangqi Li
Wenwen Wang
Shilin Wang
46
0
0
21 Sep 2024
DFDG: Data-Free Dual-Generator Adversarial Distillation for One-Shot Federated Learning
Kangyang Luo
Shuai Wang
Y. Fu
Renrong Shao
Xiang Li
Yunshi Lan
Ming Gao
Jinlong Shu
FedML
113
3
0
12 Sep 2024
DKDM: Data-Free Knowledge Distillation for Diffusion Models with Any Architecture
Qianlong Xiang
Miao Zhang
Yuzhang Shang
Jianlong Wu
Yan Yan
Liqiang Nie
DiffM
125
10
0
05 Sep 2024
Condensed Sample-Guided Model Inversion for Knowledge Distillation
Kuluhan Binici
Shivam Aggarwal
Cihan Acar
N. Pham
K. Leman
Gim Hee Lee
Tulika Mitra
86
1
0
25 Aug 2024
Optimizing Vision Transformers with Data-Free Knowledge Transfer
Gousia Habib
Damandeep Singh
I. Malik
Brejesh Lall
70
1
0
12 Aug 2024
Distilling Vision-Language Foundation Models: A Data-Free Approach via Prompt Diversification
Yunyi Xuan
Weijie Chen
Shicai Yang
Di Xie
Luojun Lin
Yueting Zhuang
VLM
109
4
0
21 Jul 2024
Encapsulating Knowledge in One Prompt
Qi Li
Runpeng Yu
Xinchao Wang
VLM, KELM
76
3
0
16 Jul 2024
Small Scale Data-Free Knowledge Distillation
He Liu
Yikai Wang
Huaping Liu
Fuchun Sun
Anbang Yao
75
10
0
12 Jun 2024
Data-free Knowledge Distillation for Fine-grained Visual Categorization
Renrong Shao
Wei Zhang
Jianhua Yin
Jun Wang
61
3
0
18 Apr 2024
De-confounded Data-free Knowledge Distillation for Handling Distribution Shifts
Yuzheng Wang
Dingkang Yang
Zhaoyu Chen
Yang Liu
Siao Liu
Wenqiang Zhang
Lihua Zhang
Lizhe Qi
73
9
0
28 Mar 2024
Learning to Project for Cross-Task Knowledge Distillation
Dylan Auty
Roy Miles
Benedikt Kolbeinsson
K. Mikolajczyk
85
0
0
21 Mar 2024
Text-Enhanced Data-free Approach for Federated Class-Incremental Learning
Minh-Tuan Tran
Trung Le
Xuan-May Le
Mehrtash Harandi
Dinh Q. Phung
CLL
112
10
0
21 Mar 2024
$V_kD$: Improving Knowledge Distillation using Orthogonal Projections
Roy Miles
Ismail Elezi
Jiankang Deng
112
10
0
10 Mar 2024
Data-Free Hard-Label Robustness Stealing Attack
Xiaojian Yuan
Kejiang Chen
Wen Huang
Jie Zhang
Weiming Zhang
Neng H. Yu
AAML
63
5
0
10 Dec 2023
Task-Distributionally Robust Data-Free Meta-Learning
Zixuan Hu
Li Shen
Zhenyi Wang
Yongxian Wei
Baoyuan Wu
Chun Yuan
Dacheng Tao
OOD
63
0
0
23 Nov 2023
Data-Free Knowledge Distillation Using Adversarially Perturbed OpenGL Shader Images
Logan Frank
Jim Davis
77
1
0
20 Oct 2023
NAYER: Noisy Layer Data Generation for Efficient and Effective Data-free Knowledge Distillation
Minh-Tuan Tran
Trung Le
Xuan-May Le
Mehrtash Harandi
Quan Hung Tran
Dinh Q. Phung
84
13
0
30 Sep 2023
Towards Few-Call Model Stealing via Active Self-Paced Knowledge Distillation and Diffusion-Based Image Generation
Vlad Hondru
Radu Tudor Ionescu
DiffM
106
2
0
29 Sep 2023
DFRD: Data-Free Robustness Distillation for Heterogeneous Federated Learning
Kangyang Luo
Shuai Wang
Y. Fu
Xiang Li
Yunshi Lan
Minghui Gao
FedML
91
29
0
24 Sep 2023
Sampling to Distill: Knowledge Transfer from Open-World Data
Yuzheng Wang
Zhaoyu Chen
Jie M. Zhang
Dingkang Yang
Zuhao Ge
Yang Liu
Siao Liu
Yunquan Sun
Wenqiang Zhang
Lizhe Qi
85
9
0
31 Jul 2023
Distribution Shift Matters for Knowledge Distillation with Webly Collected Images
Jialiang Tang
Shuo Chen
Gang Niu
Masashi Sugiyama
Chenggui Gong
74
14
0
21 Jul 2023
Customizing Synthetic Data for Data-Free Student Learning
Shiya Luo
Defang Chen
Can Wang
42
2
0
10 Jul 2023
Categories of Response-Based, Feature-Based, and Relation-Based Knowledge Distillation
Chuanguang Yang
Xinqiang Yu
Zhulin An
Yongjun Xu
VLM, OffRL
193
26
0
19 Jun 2023
Revisiting Data-Free Knowledge Distillation with Poisoned Teachers
Junyuan Hong
Yi Zeng
Shuyang Yu
Lingjuan Lyu
R. Jia
Jiayu Zhou
AAML
57
9
0
04 Jun 2023
Learning to Learn from APIs: Black-Box Data-Free Meta-Learning
Zixuan Hu
Li Shen
Zhenyi Wang
Baoyuan Wu
Chun Yuan
Dacheng Tao
132
8
0
28 May 2023
FoPro-KD: Fourier Prompted Effective Knowledge Distillation for Long-Tailed Medical Image Recognition
Marawan Elbatel
Robert Martí
Xuelong Li
AAML
116
11
0
27 May 2023
Is Synthetic Data From Diffusion Models Ready for Knowledge Distillation?
Zheng Li
Yuxuan Li
Penghai Zhao
Renjie Song
Xiang Li
Jian Yang
99
20
0
22 May 2023
Out of Thin Air: Exploring Data-Free Adversarial Robustness Distillation
Yuzheng Wang
Zhaoyu Chen
Dingkang Yang
Pinxue Guo
Kaixun Jiang
Wenqiang Zhang
Lizhe Qi
AAML
67
6
0
21 Mar 2023
Architecture, Dataset and Model-Scale Agnostic Data-free Meta-Learning
Zixuan Hu
Li Shen
Zhenyi Wang
Tongliang Liu
Chun Yuan
Dacheng Tao
135
4
0
20 Mar 2023
Data-Free Sketch-Based Image Retrieval
Abhra Chaudhuri
A. Bhunia
Yi-Zhe Song
Anjan Dutta
82
7
0
14 Mar 2023
Feature-Rich Audio Model Inversion for Data-Free Knowledge Distillation Towards General Sound Classification
Zuheng Kang
Yayun He
Jianzong Wang
Junqing Peng
Xiaoyang Qu
Jing Xiao
47
2
0
14 Mar 2023
Learning to Retain while Acquiring: Combating Distribution-Shift in Adversarial Data-Free Knowledge Distillation
Gaurav Patel
Konda Reddy Mopuri
Qiang Qiu
76
28
0
28 Feb 2023
Explicit and Implicit Knowledge Distillation via Unlabeled Data
Yuzheng Wang
Zuhao Ge
Zhaoyu Chen
Xiangjian Liu
Chuang Ma
Yunquan Sun
Lizhe Qi
110
10
0
17 Feb 2023
Momentum Adversarial Distillation: Handling Large Distribution Shifts in Data-Free Knowledge Distillation
Kien Do
Hung Le
D. Nguyen
Dang Nguyen
Haripriya Harikumar
T. Tran
Santu Rana
Svetha Venkatesh
68
33
0
21 Sep 2022
Dynamic Data-Free Knowledge Distillation by Easy-to-Hard Learning Strategy
Jingru Li
Sheng Zhou
Liangcheng Li
Haishuai Wang
Zhi Yu
Jiajun Bu
88
14
0
29 Aug 2022
CDFKD-MFS: Collaborative Data-free Knowledge Distillation via Multi-level Feature Sharing
Zhiwei Hao
Yong Luo
Zhi Wang
Han Hu
J. An
97
28
0
24 May 2022
IDEAL: Query-Efficient Data-Free Learning from Black-box Models
Jie M. Zhang
Chen Chen
Lingjuan Lyu
125
15
0
23 May 2022
Robust and Resource-Efficient Data-Free Knowledge Distillation by Generative Pseudo Replay
Kuluhan Binici
Shivam Aggarwal
N. Pham
K. Leman
T. Mitra
TTA
110
48
0
09 Jan 2022
Conditional Generative Data-free Knowledge Distillation
Xinyi Yu
Ling Yan
Yang Yang
Libo Zhou
Linlin Ou
76
8
0
31 Dec 2021
Data-Free Knowledge Transfer: A Survey
Yuang Liu
Wei Zhang
Jun Wang
Jianyong Wang
119
48
0
31 Dec 2021
Up to 100× Faster Data-free Knowledge Distillation
Gongfan Fang
Kanya Mo
Xinchao Wang
Mingli Song
Shitao Bei
Haofei Zhang
Xiuming Zhang
DD
89
4
0
12 Dec 2021