Class Attention Transfer Based Knowledge Distillation
arXiv:2304.12777 · 25 April 2023
Ziyao Guo, Haonan Yan, Hui Li, Xiaodong Lin
Papers citing "Class Attention Transfer Based Knowledge Distillation" (10 / 10 papers shown)
Swapped Logit Distillation via Bi-level Teacher Alignment
Stephen Ekaputra Limantoro, Jhe-Hao Lin, Chih-Yu Wang, Yi-Lung Tsai, Hong-Han Shuai, Ching-Chun Huang, Wen-Huang Cheng
27 Apr 2025

HDC: Hierarchical Distillation for Multi-level Noisy Consistency in Semi-Supervised Fetal Ultrasound Segmentation
Tran Quoc Khanh Le, Nguyen Lan Vi Vu, Ha-Hieu Pham, Xuan-Loc Huynh, T. Nguyen, Minh Huu Nhat Le, Quan Nguyen, Hien Nguyen
14 Apr 2025

VRM: Knowledge Distillation via Virtual Relation Matching
W. Zhang, Fei Xie, Weidong Cai, Chao Ma
28 Feb 2025

Contrastive Representation Distillation via Multi-Scale Feature Decoupling
Cuipeng Wang, Tieyuan Chen, Haipeng Wang
09 Feb 2025

Rethinking Knowledge in Distillation: An In-context Sample Retrieval Perspective
Jinjing Zhu, Songze Li, Lin Wang
13 Jan 2025

Instance Temperature Knowledge Distillation
Zhengbo Zhang, Yuxi Zhou, Jia Gong, Jun Liu, Zhigang Tu
27 Jun 2024

CKD: Contrastive Knowledge Distillation from A Sample-wise Perspective
Wencheng Zhu, Xin Zhou, Pengfei Zhu, Yu Wang, Qinghua Hu
VLM · 22 Apr 2024

Review helps learn better: Temporal Supervised Knowledge Distillation
Dongwei Wang, Zhi-Long Han, Yanmei Wang, Xi’ai Chen, Baicheng Liu, Yandong Tang
03 Jul 2023

Distilling Knowledge via Knowledge Review
Pengguang Chen, Shu Liu, Hengshuang Zhao, Jiaya Jia
19 Apr 2021

Densely Connected Convolutional Networks
Gao Huang, Zhuang Liu, Laurens van der Maaten, Kilian Q. Weinberger
PINN · 3DV · 25 Aug 2016