arXiv: 2209.10359
Momentum Adversarial Distillation: Handling Large Distribution Shifts in Data-Free Knowledge Distillation
21 September 2022
Kien Do, Hung Le, D. Nguyen, Dang Nguyen, Haripriya Harikumar, T. Tran, Santu Rana, Svetha Venkatesh
Papers citing "Momentum Adversarial Distillation: Handling Large Distribution Shifts in Data-Free Knowledge Distillation" (23 papers shown):

1. CAE-DFKD: Bridging the Transferability Gap in Data-Free Knowledge Distillation. Zherui Zhang, Changwei Wang, Rongtao Xu, W. Xu, Shibiao Xu, Yu Zhang, Li Guo. 30 Apr 2025.
2. Adversarial Curriculum Graph-Free Knowledge Distillation for Graph Neural Networks. Yuang Jia, Xiaojuan Shan, Jun-Xiong Xia, Guancheng Wan, Y. Zhang, Wenke Huang, Mang Ye, Stan Z. Li. 01 Apr 2025.
3. FedMHO: Heterogeneous One-Shot Federated Learning Towards Resource-Constrained Edge Devices. Dezhong Yao, Yuexin Shi, Tongtong Liu, Zhiqiang Xu. 12 Feb 2025.
4. Toward Efficient Data-Free Unlearning [MU]. Chenhao Zhang, Shaofei Shen, Weitong Chen, Miao Xu. 18 Dec 2024.
5. Large-Scale Data-Free Knowledge Distillation for ImageNet via Multi-Resolution Data Generation. Minh-Tuan Tran, Trung Le, Xuan-May Le, Jianfei Cai, Mehrtash Harandi, Dinh Q. Phung. 26 Nov 2024.
6. DFDG: Data-Free Dual-Generator Adversarial Distillation for One-Shot Federated Learning [FedML]. Kangyang Luo, Shuai Wang, Y. Fu, Renrong Shao, Xiang Li, Yunshi Lan, Ming Gao, Jinlong Shu. 12 Sep 2024.
7. Small Scale Data-Free Knowledge Distillation. He Liu, Yikai Wang, Huaping Liu, Fuchun Sun, Anbang Yao. 12 Jun 2024.
8. Mind the Gap Between Synthetic and Real: Utilizing Transfer Learning to Probe the Boundaries of Stable Diffusion Generated Data [DiffM]. Leonhard Hennicke, C. Adriano, Holger Giese, Jan Mathias Koehler, Lukas Schott. 06 May 2024.
9. Why does Knowledge Distillation Work? Rethink its Attention and Fidelity Mechanism. Chenqi Guo, Shiwei Zhong, Xiaofeng Liu, Qianli Feng, Yinglong Ma. 30 Apr 2024.
10. Data-free Knowledge Distillation for Fine-grained Visual Categorization. Renrong Shao, Wei Zhang, Jianhua Yin, Jun Wang. 18 Apr 2024.
11. Weight Copy and Low-Rank Adaptation for Few-Shot Distillation of Vision Transformers. Diana-Nicoleta Grigore, Mariana-Iuliana Georgescu, J. A. Justo, T. Johansen, Andreea-Iuliana Ionescu, Radu Tudor Ionescu. 14 Apr 2024.
12. De-confounded Data-free Knowledge Distillation for Handling Distribution Shifts. Yuzheng Wang, Dingkang Yang, Zhaoyu Chen, Yang Liu, Siao Liu, Wenqiang Zhang, Lihua Zhang, Lizhe Qi. 28 Mar 2024.
13. AuG-KD: Anchor-Based Mixup Generation for Out-of-Domain Knowledge Distillation. Zihao Tang, Zheqi Lv, Shengyu Zhang, Yifan Zhou, Xinyu Duan, Fei Wu, Kun Kuang. 11 Mar 2024.
14. Teacher as a Lenient Expert: Teacher-Agnostic Data-Free Knowledge Distillation [AAML]. Hyunjune Shin, Dong-Wan Choi. 18 Feb 2024.
15. Direct Distillation between Different Domains. Jialiang Tang, Shuo Chen, Gang Niu, Hongyuan Zhu, Joey Tianyi Zhou, Chen Gong, Masashi Sugiyama. 12 Jan 2024.
16. Data-Free Hard-Label Robustness Stealing Attack [AAML]. Xiaojian Yuan, Kejiang Chen, Wen Huang, Jie Zhang, Weiming Zhang, Neng H. Yu. 10 Dec 2023.
17. Task-Distributionally Robust Data-Free Meta-Learning [OOD]. Zixuan Hu, Li Shen, Zhenyi Wang, Yongxian Wei, Baoyuan Wu, Chun Yuan, Dacheng Tao. 23 Nov 2023.
18. NAYER: Noisy Layer Data Generation for Efficient and Effective Data-free Knowledge Distillation. Minh-Tuan Tran, Trung Le, Xuan-May Le, Mehrtash Harandi, Quan Hung Tran, Dinh Q. Phung. 30 Sep 2023.
19. DFRD: Data-Free Robustness Distillation for Heterogeneous Federated Learning [FedML]. Kangyang Luo, Shuai Wang, Y. Fu, Xiang Li, Yunshi Lan, Minghui Gao. 24 Sep 2023.
20. Sampling to Distill: Knowledge Transfer from Open-World Data. Yuzheng Wang, Zhaoyu Chen, Jie M. Zhang, Dingkang Yang, Zuhao Ge, Yang Liu, Siao Liu, Yunquan Sun, Wenqiang Zhang, Lizhe Qi. 31 Jul 2023.
21. Distribution Shift Matters for Knowledge Distillation with Webly Collected Images. Jialiang Tang, Shuo Chen, Gang Niu, Masashi Sugiyama, Chenggui Gong. 21 Jul 2023.
22. A Comprehensive Survey of Forgetting in Deep Learning Beyond Continual Learning [KELM, MU]. Zhenyi Wang, Enneng Yang, Li Shen, Heng-Chiao Huang. 16 Jul 2023.
23. Learning to Retain while Acquiring: Combating Distribution-Shift in Adversarial Data-Free Knowledge Distillation. Gaurav Patel, Konda Reddy Mopuri, Qiang Qiu. 28 Feb 2023.