Knowledge Distillation with Feature Maps for Image Classification

Wei-Chun Chen, Chia-Che Chang, Chien-Yu Lu, Che-Rung Lee
3 December 2018 (arXiv:1812.00660)

Papers citing "Knowledge Distillation with Feature Maps for Image Classification"

All 19 citing papers are listed below (topic tags in brackets).
1. A Novel Compression Framework for YOLOv8: Achieving Real-Time Aerial Object Detection on Edge Devices via Structured Pruning and Channel-Wise Distillation
   Melika Sabaghian, Mohammad Ali Keyvanrad, Seyyedeh Mahila Moghadami (16 Sep 2025)
2. AME: Aligned Manifold Entropy for Robust Vision-Language Distillation
   Guiming Cao, Yuming Ou (12 Aug 2025) [AAML, VLM]
3. Improving Zero-shot Generalization of Learned Prompts via Unsupervised Knowledge Distillation
   Marco Mistretta, Alberto Baldrati, Marco Bertini, Andrew D. Bagdanov (03 Jul 2024) [VP, VLM]
4. FedDr+: Stabilizing Dot-regression with Global Feature Distillation for Federated Learning
   Seongyoon Kim, Minchan Jeong, Sungnyun Kim, Sungwoo Cho, Sumyeong Ahn, Se-Young Yun (04 Jun 2024) [FedML]
5. A Progressive Framework of Vision-language Knowledge Distillation and Alignment for Multilingual Scene
   Wenbo Zhang, Yifan Zhang, Jianfeng Lin, Binqiang Huang, Jinlu Zhang, Wenhao Yu (17 Apr 2024) [VLM]
6. Scheduled Knowledge Acquisition on Lightweight Vector Symbolic Architectures for Brain-Computer Interfaces
   Yejia Liu, Shijin Duan, Xiaolin Xu, Shaolei Ren (18 Mar 2024)
7. Spectral Co-Distillation for Personalized Federated Learning
   Zihan Chen, Howard H. Yang, Tony Q.S. Quek, Kai Fong Ernest Chong (29 Jan 2024) [OOD, FedML]
8. Teacher-Student Architecture for Knowledge Distillation: A Survey
   Chengming Hu, Xuan Li, Danyang Liu, Haolun Wu, Xi Chen, Ju Wang, Xue Liu (08 Aug 2023)
9. Knowledge Distillation in Vision Transformers: A Critical Review
   Gousia Habib, Tausifa Jan Saleem, Brejesh Lall (04 Feb 2023)
10. Leveraging Different Learning Styles for Improved Knowledge Distillation in Biomedical Imaging
    Usma Niyaz, A. Sambyal, Deepti R. Bathula (06 Dec 2022)
11. Improving Neural Cross-Lingual Summarization via Employing Optimal Transport Distance for Knowledge Distillation. AAAI Conference on Artificial Intelligence (AAAI), 2021.
    Thong Nguyen, Anh Tuan Luu (07 Dec 2021)
12. Knowledge Distillation as Semiparametric Inference. International Conference on Learning Representations (ICLR), 2021.
    Tri Dao, G. Kamath, Vasilis Syrgkanis, Lester W. Mackey (20 Apr 2021)
13. Resolution-Based Distillation for Efficient Histology Image Classification
    Joseph DiPalma, A. Suriawinata, L. Tafe, Lorenzo Torresani, Saeed Hassanpour (11 Jan 2021)
14. Domain Adaptive Knowledge Distillation for Driving Scene Semantic Segmentation
    D. Kothandaraman, Athira M. Nambiar, Anurag Mittal (03 Nov 2020) [CLL]
15. ProxylessKD: Direct Knowledge Distillation with Inherited Classifier for Face Recognition
    W. Shi, Guanghui Ren, Yunpeng Chen, Shuicheng Yan (31 Oct 2020) [CVBM]
16. Malaria detection from RBC images using shallow Convolutional Neural Networks
    S. Sarkar, Rati Sharma, Kushal Shah (22 Oct 2020)
17. Knowledge Distillation: A Survey
    Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao (09 Jun 2020) [VLM]
18. Preparing Lessons: Improve Knowledge Distillation with Better Supervision
    Tiancheng Wen, Shenqi Lai, Xueming Qian (18 Nov 2019)
19. Adversarially Robust Distillation. AAAI Conference on Artificial Intelligence (AAAI), 2019.
    Micah Goldblum, Liam H. Fowl, Soheil Feizi, Tom Goldstein (23 May 2019) [AAML]