Residual Knowledge Distillation
arXiv:2002.09168, 21 February 2020
Mengya Gao, Yujun Shen, Quanquan Li, Chen Change Loy

Papers citing "Residual Knowledge Distillation"

12 papers
Distilling Efficient Vision Transformers from CNNs for Semantic Segmentation
Pattern Recognition (Pattern Recogn.), 2023
Xueye Zheng, Yunhao Luo, Pengyuan Zhou, Lin Wang
11 Oct 2023

Hilbert Distillation for Cross-Dimensionality Networks
Neural Information Processing Systems (NeurIPS), 2022
Dian Qin, Haishuai Wang, Yanfeng Guo, Hongjia Xu, Sheng Zhou, Jiajun Bu
08 Nov 2022

Efficient Knowledge Distillation from Model Checkpoints
Neural Information Processing Systems (NeurIPS), 2022
Chaofei Wang, Qisen Yang, Rui Huang, Qing Xiao, Gao Huang
12 Oct 2022

Reducing Capacity Gap in Knowledge Distillation with Review Mechanism for Crowd Counting
Yunxin Liu, Qiaosi Yi, Jinshan Zeng
11 Jun 2022

CoCoFL: Communication- and Computation-Aware Federated Learning via Partial NN Freezing and Quantization
Kilian Pfeiffer, Martin Rapp, R. Khalili, J. Henkel
10 Mar 2022

Learn From the Past: Experience Ensemble Knowledge Distillation
International Conference on Pattern Recognition (ICPR), 2022
Chaofei Wang, Shaowei Zhang, Qing Xiao, Gao Huang
25 Feb 2022

Controlling the Quality of Distillation in Response-Based Network Compression
Vibhas Kumar Vats, David J. Crandall
19 Dec 2021

Compacting Deep Neural Networks for Internet of Things: Methods and Applications
IEEE Internet of Things Journal (IEEE IoT Journal), 2021
Ke Zhang, Hanbo Ying, Hongning Dai, Lin Li, Yuangyuang Peng, Keyi Guo, Hongfang Yu
20 Mar 2021

Distilling Knowledge via Intermediate Classifiers
Aryan Asadian, Amirali Salehi-Abari
28 Feb 2021

Even your Teacher Needs Guidance: Ground-Truth Targets Dampen Regularization Imposed by Self-Distillation
Neural Information Processing Systems (NeurIPS), 2021
Kenneth Borup, L. Andersen
25 Feb 2021

Knowledge Distillation: A Survey
Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao
09 Jun 2020

Channel Distillation: Channel-Wise Attention for Knowledge Distillation
Zaida Zhou, Chaoran Zhuge, Xinwei Guan, Wen Liu
02 Jun 2020