Search for Better Students to Learn Distilled Knowledge
Jindong Gu, Volker Tresp
30 January 2020 · arXiv: 2001.11612
Papers citing "Search for Better Students to Learn Distilled Knowledge" (6 papers)
| Title | Authors | Tags | Citations | Date |
| --- | --- | --- | --- | --- |
| Explainability and Robustness of Deep Visual Classification Models | Jindong Gu | AAML | 2 | 03 Jan 2023 |
| Design Automation for Fast, Lightweight, and Effective Deep Learning Models: A Survey | Dalin Zhang, Kaixuan Chen, Yan Zhao, B. Yang, Li-Ping Yao, Christian S. Jensen | | 3 | 22 Aug 2022 |
| Simple Distillation Baselines for Improving Small Self-supervised Models | Jindong Gu, Wei Liu, Yonglong Tian | | 8 | 21 Jun 2021 |
| Knowledge Distillation: A Survey | Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao | VLM | 2,835 | 09 Jun 2020 |
| ResKD: Residual-Guided Knowledge Distillation | Xuewei Li, Songyuan Li, Bourahla Omar, Fei Wu, Xi Li | | 47 | 08 Jun 2020 |
| Neural Architecture Search with Reinforcement Learning | Barret Zoph, Quoc V. Le | | 5,326 | 05 Nov 2016 |