Channel Distillation: Channel-Wise Attention for Knowledge Distillation
arXiv: 2006.01683
2 June 2020
Zaida Zhou, Chaoran Zhuge, Xinwei Guan, Wen Liu
Papers citing "Channel Distillation: Channel-Wise Attention for Knowledge Distillation" (4 of 4 shown):

- Data-free Knowledge Distillation for Fine-grained Visual Categorization. Renrong Shao, Wei Zhang, Jianhua Yin, Jun Wang. 18 Apr 2024.
- Attention-guided Feature Distillation for Semantic Segmentation. Amir M. Mansourian, Arya Jalali, Rozhan Ahmadi, S. Kasaei. 08 Mar 2024.
- Student-friendly Knowledge Distillation. Mengyang Yuan, Bo Lang, Fengnan Quan. 18 May 2023.
- Multi-scale Feature Extraction and Fusion for Online Knowledge Distillation. Panpan Zou, Yinglei Teng, Tao Niu. 16 Jun 2022.