Explaining Knowledge Distillation by Quantifying the Knowledge

Computer Vision and Pattern Recognition (CVPR), 2020
7 March 2020
Xu Cheng, Zhefan Rao, Yilan Chen, Quanshi Zhang
arXiv: 2003.03622 (abs · PDF · HTML)

Papers citing "Explaining Knowledge Distillation by Quantifying the Knowledge"

10 of 60 citing papers shown.

Hands-on Guidance for Distilling Object Detectors
Yangyang Qin, H. Ling, Zhenghai He, Yuxuan Shi, Lei Wu
26 Mar 2021 · ObjD, FedML · 64 / 1 / 0

Joint framework with deep feature distillation and adaptive focal loss for weakly supervised audio tagging and acoustic event detection
Yunhao Liang, Yanhua Long, Yijie Li, Jiaen Liang, Yuping Wang
23 Mar 2021 · 87 / 10 / 0

Student Network Learning via Evolutionary Knowledge Distillation
Kangkai Zhang, Chunhui Zhang, Shikun Li, Dan Zeng, Shiming Ge
23 Mar 2021 · 103 / 93 / 0

Spectral Roll-off Points Variations: Exploring Useful Information in Feature Maps by Its Variations
Yunkai Yu, Yuyang You, Zhihong Yang, Guozheng Liu, Peiyao Li, Zhicheng Yang, Wenjing Shan
31 Jan 2021 · 91 / 2 / 0

Multi-level Knowledge Distillation via Knowledge Alignment and Correlation
Fei Ding, Yin Yang, Hongxin Hu, Venkat Krovi, Feng Luo
01 Dec 2020 · 67 / 4 / 0

A Survey on Deep Neural Network Compression: Challenges, Overview, and Solutions
Rahul Mishra, Hari Prabhat Gupta, Tanima Dutta
05 Oct 2020 · 92 / 96 / 0

Differential Replication in Machine Learning
Irene Unceta, Jordi Nin, O. Pujol
15 Jul 2020 · SyDa · 61 / 1 / 0

On the Demystification of Knowledge Distillation: A Residual Network Perspective
N. Jha, Rajat Saini, Sparsh Mittal
30 Jun 2020 · 78 / 4 / 0

Interpreting and Disentangling Feature Components of Various Complexity from DNNs
Jie Ren, Mingjie Li, Zexu Liu, Quanshi Zhang
29 Jun 2020 · CoGe · 138 / 20 / 0

Knowledge Distillation: A Survey
Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao
09 Jun 2020 · VLM · 694 / 3,365 / 0