Channel Distillation: Channel-Wise Attention for Knowledge Distillation

2 June 2020
Zaida Zhou, Chaoran Zhuge, Xinwei Guan, Wen Liu
arXiv: 2006.01683 (PDF / HTML)
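For context on the citing papers below: the core idea named in the title is to transfer channel-wise attention statistics from a teacher network to a student. The following is a minimal PyTorch-style sketch, assuming the channel attention is the global-average-pooled activation of each channel (an SE-style statistic); the function names, the MSE loss form, and the feature shapes are illustrative assumptions, not the authors' exact formulation.

```python
import torch
import torch.nn.functional as F


def channel_attention(feat: torch.Tensor) -> torch.Tensor:
    """Global-average-pool each channel: (B, C, H, W) -> (B, C)."""
    return feat.mean(dim=(2, 3))


def channel_distillation_loss(student_feat: torch.Tensor,
                              teacher_feat: torch.Tensor) -> torch.Tensor:
    """MSE between student and teacher channel-attention vectors.

    Assumes both feature maps share the same channel count; if they
    differ, a 1x1 projection on the student side would be needed.
    """
    s_att = channel_attention(student_feat)
    t_att = channel_attention(teacher_feat.detach())  # no gradient to teacher
    return F.mse_loss(s_att, t_att)


if __name__ == "__main__":
    # Toy paired intermediate features (hypothetical shapes).
    s = torch.randn(4, 256, 14, 14, requires_grad=True)  # student feature map
    t = torch.randn(4, 256, 14, 14)                      # teacher feature map
    loss = channel_distillation_loss(s, t)
    loss.backward()
    print(loss.item())
```

In practice such a term would be summed over several paired layers and added, with a weighting coefficient, to the ordinary task loss on the student.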

Papers citing "Channel Distillation: Channel-Wise Attention for Knowledge Distillation"

4 of 4 citing papers shown:

Data-free Knowledge Distillation for Fine-grained Visual Categorization
Renrong Shao, Wei Zhang, Jianhua Yin, Jun Wang (18 Apr 2024)

Attention-guided Feature Distillation for Semantic Segmentation
Amir M. Mansourian, Arya Jalali, Rozhan Ahmadi, S. Kasaei (08 Mar 2024)

Student-friendly Knowledge Distillation
Mengyang Yuan, Bo Lang, Fengnan Quan (18 May 2023)

Multi-scale Feature Extraction and Fusion for Online Knowledge Distillation
Panpan Zou, Yinglei Teng, Tao Niu (16 Jun 2022)