Cited By: Data-Free Knowledge Amalgamation via Group-Stack Dual-GAN (arXiv 2003.09088)
20 March 2020
Jingwen Ye, Yixin Ji, Xinchao Wang, Xin Gao, Mingli Song
Papers citing "Data-Free Knowledge Amalgamation via Group-Stack Dual-GAN" (7 papers):
Sampling to Distill: Knowledge Transfer from Open-World Data
Yuzheng Wang, Zhaoyu Chen, Jie M. Zhang, Dingkang Yang, Zuhao Ge, Yang Liu, Siao Liu, Yunquan Sun, Wenqiang Zhang, Lizhe Qi
31 Jul 2023
FoPro-KD: Fourier Prompted Effective Knowledge Distillation for Long-Tailed Medical Image Recognition
Marawan Elbatel, Robert Martí, X. Li
AAML · 27 May 2023
TinyMIM: An Empirical Study of Distilling MIM Pre-trained Models
Sucheng Ren, Fangyun Wei, Zheng-Wei Zhang, Han Hu
03 Jan 2023
IDEAL: Query-Efficient Data-Free Learning from Black-box Models
Jie M. Zhang, Chen Chen, Lingjuan Lyu
23 May 2022
The Augmented Image Prior: Distilling 1000 Classes by Extrapolating from a Single Image
Yuki M. Asano, Aaqib Saeed
01 Dec 2021
Learning Propagation Rules for Attribution Map Generation
Yiding Yang, Jiayan Qiu, Mingli Song, Dacheng Tao, Xinchao Wang
FAtt · 14 Oct 2020
Knowledge Distillation: A Survey
Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao
VLM · 09 Jun 2020