Residual Knowledge Distillation (arXiv 2002.09168)
21 February 2020
Mengya Gao, Yujun Shen, Quanquan Li, Chen Change Loy
Papers citing "Residual Knowledge Distillation" (12 of 12 papers shown)
Distilling Efficient Vision Transformers from CNNs for Semantic Segmentation
Xueye Zheng, Yunhao Luo, Pengyuan Zhou, Lin Wang
27 · 12 · 0 · 11 Oct 2023
Hilbert Distillation for Cross-Dimensionality Networks
Dian Qin, Haishuai Wang, Zhe Liu, Hongjia Xu, Sheng Zhou, Jiajun Bu
21 · 4 · 0 · 08 Nov 2022
Efficient Knowledge Distillation from Model Checkpoints
Chaofei Wang, Qisen Yang, Rui Huang, S. Song, Gao Huang
FedML · 6 · 35 · 0 · 12 Oct 2022
Reducing Capacity Gap in Knowledge Distillation with Review Mechanism for Crowd Counting
Yunxin Liu, Qiaosi Yi, Jinshan Zeng
15 · 1 · 0 · 11 Jun 2022
CoCoFL: Communication- and Computation-Aware Federated Learning via Partial NN Freezing and Quantization
Kilian Pfeiffer, Martin Rapp, R. Khalili, J. Henkel
FedML · 11 · 11 · 0 · 10 Mar 2022
Learn From the Past: Experience Ensemble Knowledge Distillation
Chaofei Wang, Shaowei Zhang, S. Song, Gao Huang
25 · 4 · 0 · 25 Feb 2022
Controlling the Quality of Distillation in Response-Based Network Compression
Vibhas Kumar Vats, David J. Crandall
11 · 1 · 0 · 19 Dec 2021
Compacting Deep Neural Networks for Internet of Things: Methods and Applications
Ke Zhang, Hanbo Ying, Hongning Dai, Lin Li, Yuangyuang Peng, Keyi Guo, Hongfang Yu
16 · 38 · 0 · 20 Mar 2021
Distilling Knowledge via Intermediate Classifiers
Aryan Asadian, Amirali Salehi-Abari
27 · 1 · 0 · 28 Feb 2021
Even your Teacher Needs Guidance: Ground-Truth Targets Dampen Regularization Imposed by Self-Distillation
Kenneth Borup, L. Andersen
17 · 14 · 0 · 25 Feb 2021
Knowledge Distillation: A Survey
Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao
VLM · 19 · 2,835 · 0 · 09 Jun 2020
Channel Distillation: Channel-Wise Attention for Knowledge Distillation
Zaida Zhou, Chaoran Zhuge, Xinwei Guan, Wen Liu
6 · 49 · 0 · 02 Jun 2020