Robust and Resource-Efficient Data-Free Knowledge Distillation by Generative Pseudo Replay
Kuluhan Binici, Shivam Aggarwal, N. Pham, K. Leman, T. Mitra
arXiv:2201.03019 · 9 January 2022 · TTA

Papers citing "Robust and Resource-Efficient Data-Free Knowledge Distillation by Generative Pseudo Replay" (6 of 6 papers shown):

DKDM: Data-Free Knowledge Distillation for Diffusion Models with Any Architecture
Qianlong Xiang, Miao Zhang, Yuzhang Shang, Jianlong Wu, Yan Yan, Liqiang Nie
DiffM · 05 Sep 2024

Towards Synchronous Memorizability and Generalizability with Site-Modulated Diffusion Replay for Cross-Site Continual Segmentation
Dunyuan Xu, Xi Wang, Jingyang Zhang, Pheng-Ann Heng
MedIm · CLL · 26 Jun 2024

Is Synthetic Data From Diffusion Models Ready for Knowledge Distillation?
Zheng Li, Yuxuan Li, Penghai Zhao, Renjie Song, Xiang Li, Jian Yang
22 May 2023

Momentum Adversarial Distillation: Handling Large Distribution Shifts in Data-Free Knowledge Distillation
Kien Do, Hung Le, D. Nguyen, Dang Nguyen, Haripriya Harikumar, T. Tran, Santu Rana, Svetha Venkatesh
21 Sep 2022

Dynamic Data-Free Knowledge Distillation by Easy-to-Hard Learning Strategy
Jingru Li, Sheng Zhou, Liangcheng Li, Haishuai Wang, Zhi Yu, Jiajun Bu
29 Aug 2022

AdaptCL: Adaptive Continual Learning for Tackling Heterogeneity in Sequential Datasets
Yuqing Zhao, Divya Saxena, Jiannong Cao
CLL · 22 Jul 2022