Large-Scale Generative Data-Free Distillation
arXiv:2012.05578 · 10 December 2020
Liangchen Luo, Mark Sandler, Zi Lin, A. Zhmoginov, Andrew G. Howard
Papers citing "Large-Scale Generative Data-Free Distillation" (9 papers):
DKDM: Data-Free Knowledge Distillation for Diffusion Models with Any Architecture
Qianlong Xiang, Miao Zhang, Yuzhang Shang, Jianlong Wu, Yan Yan, Liqiang Nie
05 Sep 2024 · DiffM
Sampling to Distill: Knowledge Transfer from Open-World Data
Yuzheng Wang, Zhaoyu Chen, Jie M. Zhang, Dingkang Yang, Zuhao Ge, Yang Liu, Siao Liu, Yunquan Sun, Wenqiang Zhang, Lizhe Qi
31 Jul 2023
Image Captions are Natural Prompts for Text-to-Image Models
Shiye Lei, Hao Chen, Senyang Zhang, Bo-Lu Zhao, Dacheng Tao
17 Jul 2023 · VLM
Is Synthetic Data From Diffusion Models Ready for Knowledge Distillation?
Zheng Li, Yuxuan Li, Penghai Zhao, Renjie Song, Xiang Li, Jian Yang
22 May 2023
Momentum Adversarial Distillation: Handling Large Distribution Shifts in Data-Free Knowledge Distillation
Kien Do, Hung Le, D. Nguyen, Dang Nguyen, Haripriya Harikumar, T. Tran, Santu Rana, Svetha Venkatesh
21 Sep 2022
Few-Shot Unlearning by Model Inversion
Youngsik Yoon, Jinhwan Nam, Hyojeong Yun, Jaeho Lee, Dongwoo Kim, Jungseul Ok
31 May 2022 · MU
Synthesizing Informative Training Samples with GAN
Bo-Lu Zhao, Hakan Bilen
15 Apr 2022 · DD
Preventing Catastrophic Forgetting and Distribution Mismatch in Knowledge Distillation via Synthetic Data
Kuluhan Binici, N. Pham, T. Mitra, K. Leman
11 Aug 2021
Always Be Dreaming: A New Approach for Data-Free Class-Incremental Learning
James Smith, Yen-Chang Hsu, John C. Balloch, Yilin Shen, Hongxia Jin, Z. Kira
17 Jun 2021 · CLL