arXiv: 2112.06253
Up to 100× Faster Data-free Knowledge Distillation
12 December 2021
Gongfan Fang, Kanya Mo, Xinchao Wang, Jie Song, Shitao Bei, Haofei Zhang, Mingli Song
Papers citing "Up to 100× Faster Data-free Knowledge Distillation"
4 / 4 papers shown
DHBE: Data-free Holistic Backdoor Erasing in Deep Neural Networks via Restricted Adversarial Distillation
Zhicong Yan, Shenghong Li, Ruijie Zhao, Yuan Tian, Yuanyuan Zhao
13 Jun 2023

Data-Free Adversarial Knowledge Distillation for Graph Neural Networks
Yu-Lin Zhuang, Lingjuan Lyu, Chuan Shi, Carl Yang, Lichao Sun
08 May 2022

Distilling Knowledge from Graph Convolutional Networks
Yiding Yang, Jiayan Qiu, Mingli Song, Dacheng Tao, Xinchao Wang
23 Mar 2020

Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks
Chelsea Finn, Pieter Abbeel, Sergey Levine
09 Mar 2017