Cited By
Explaining Knowledge Distillation by Quantifying the Knowledge
Computer Vision and Pattern Recognition (CVPR), 2020
arXiv:2003.03622 · 7 March 2020
Xu Cheng, Zhefan Rao, Yilan Chen, Quanshi Zhang
Papers citing "Explaining Knowledge Distillation by Quantifying the Knowledge" (showing 10 of 60)
1. Hands-on Guidance for Distilling Object Detectors
   Yangyang Qin, H. Ling, Zhenghai He, Yuxuan Shi, Lei Wu
   ObjD, FedML · 64 · 1 · 0 · 26 Mar 2021

2. Joint framework with deep feature distillation and adaptive focal loss for weakly supervised audio tagging and acoustic event detection
   Yunhao Liang, Yanhua Long, Yijie Li, Jiaen Liang, Yuping Wang
   87 · 10 · 0 · 23 Mar 2021

3. Student Network Learning via Evolutionary Knowledge Distillation
   Kangkai Zhang, Chunhui Zhang, Shikun Li, Dan Zeng, Shiming Ge
   103 · 93 · 0 · 23 Mar 2021

4. Spectral Roll-off Points Variations: Exploring Useful Information in Feature Maps by Its Variations
   Yunkai Yu, Yuyang You, Zhihong Yang, Guozheng Liu, Peiyao Li, Zhicheng Yang, Wenjing Shan
   91 · 2 · 0 · 31 Jan 2021

5. Multi-level Knowledge Distillation via Knowledge Alignment and Correlation
   Fei Ding, Yin Yang, Hongxin Hu, Venkat Krovi, Feng Luo
   67 · 4 · 0 · 01 Dec 2020

6. A Survey on Deep Neural Network Compression: Challenges, Overview, and Solutions
   Rahul Mishra, Hari Prabhat Gupta, Tanima Dutta
   92 · 96 · 0 · 05 Oct 2020

7. Differential Replication in Machine Learning
   Irene Unceta, Jordi Nin, O. Pujol
   SyDa · 61 · 1 · 0 · 15 Jul 2020

8. On the Demystification of Knowledge Distillation: A Residual Network Perspective
   N. Jha, Rajat Saini, Sparsh Mittal
   78 · 4 · 0 · 30 Jun 2020

9. Interpreting and Disentangling Feature Components of Various Complexity from DNNs
   Jie Ren, Mingjie Li, Zexu Liu, Quanshi Zhang
   CoGe · 138 · 20 · 0 · 29 Jun 2020

10. Knowledge Distillation: A Survey
    Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao
    VLM · 694 · 3,365 · 0 · 09 Jun 2020