Exploring Inter-Channel Correlation for Diversity-preserved Knowledge Distillation
arXiv: 2202.03680 · 8 February 2022
Li Liu, Qingle Huang, Sihao Lin, Hongwei Xie, Bing Wang, Xiaojun Chang, Xiaodan Liang
Papers citing "Exploring Inter-Channel Correlation for Diversity-preserved Knowledge Distillation" (42 of 42 papers shown)
VRM: Knowledge Distillation via Virtual Relation Matching. W. Zhang, Fei Xie, Weidong Cai, Chao Ma. 28 Feb 2025.
I2CKD: Intra- and Inter-Class Knowledge Distillation for Semantic Segmentation. Ayoub Karine, Thibault Napoléon, M. Jridi. 24 Feb 2025. [VLM]
Rethinking Knowledge in Distillation: An In-context Sample Retrieval Perspective. Jinjing Zhu, Songze Li, Lin Wang. 13 Jan 2025.
Cross-View Consistency Regularisation for Knowledge Distillation. W. Zhang, Dongnan Liu, Weidong Cai, Chao Ma. 21 Dec 2024.
Wasserstein Distance Rivals Kullback-Leibler Divergence for Knowledge Distillation. Jiaming Lv, Haoyuan Yang, P. Li. 11 Dec 2024.
Preview-based Category Contrastive Learning for Knowledge Distillation. Muhe Ding, Jianlong Wu, Xue Dong, Xiaojie Li, Pengda Qin, Tian Gan, Liqiang Nie. 18 Oct 2024. [VLM]
Fully Exploiting Every Real Sample: SuperPixel Sample Gradient Model Stealing. Yunlong Zhao, Xiaoheng Deng, Yijing Liu, Xin-jun Pei, Jiazhi Xia, Wei Chen. 18 May 2024. [AAML]
Exploring Graph-based Knowledge: Multi-Level Feature Distillation via Channels Relational Graph. Zhiwei Wang, Jun Huang, Longhua Ma, Chengyu Wu, Hongyu Ma. 14 May 2024.
From Algorithm to Hardware: A Survey on Efficient and Safe Deployment of Deep Neural Networks. Xue Geng, Zhe Wang, Chunyun Chen, Qing Xu, Kaixin Xu, ..., Zhenghua Chen, M. Aly, Jie Lin, Min-man Wu, Xiaoli Li. 09 May 2024.
CKD: Contrastive Knowledge Distillation from A Sample-wise Perspective. Wencheng Zhu, Xin Zhou, Pengfei Zhu, Yu Wang, Qinghua Hu. 22 Apr 2024. [VLM]
Attention-guided Feature Distillation for Semantic Segmentation. Amir M. Mansourian, Arya Jalali, Rozhan Ahmadi, S. Kasaei. 08 Mar 2024.
Rethinking Centered Kernel Alignment in Knowledge Distillation. Zikai Zhou, Yunhang Shen, Shitong Shao, Linrui Gong, Shaohui Lin. 22 Jan 2024.
Bayes Conditional Distribution Estimation for Knowledge Distillation Based on Conditional Mutual Information. Linfeng Ye, Shayan Mohajer Hamidi, Renhao Tan, En-Hui Yang. 16 Jan 2024. [VLM]
RdimKD: Generic Distillation Paradigm by Dimensionality Reduction. Yi Guo, Yiqian He, Xiaoyang Li, Haotong Qin, Van Tung Pham, Yang Zhang, Shouda Liu. 14 Dec 2023.
torchdistill Meets Hugging Face Libraries for Reproducible, Coding-Free Deep Learning Studies: A Case Study on NLP. Yoshitomo Matsubara. 26 Oct 2023. [VLM]
Distillation Improves Visual Place Recognition for Low Quality Images. Anbang Yang, Yao Wang, John-Ross Rizzo, Chen Feng. 10 Oct 2023.
AICSD: Adaptive Inter-Class Similarity Distillation for Semantic Segmentation. Amir M. Mansourian, Rozhan Ahmadi, S. Kasaei. 08 Aug 2023.
EMQ: Evolving Training-free Proxies for Automated Mixed Precision Quantization. Peijie Dong, Lujun Li, Zimian Wei, Xin-Yi Niu, Zhiliang Tian, H. Pan. 20 Jul 2023. [MQ]
Distilling Universal and Joint Knowledge for Cross-Domain Model Compression on Time Series Data. Qing Xu, Min-man Wu, Xiaoli Li, K. Mao, Zhenghua Chen. 07 Jul 2023.
Cross Architecture Distillation for Face Recognition. Weisong Zhao, Xiangyu Zhu, Zhixiang He, Xiaoyu Zhang, Zhen Lei. 26 Jun 2023. [CVBM]
Categories of Response-Based, Feature-Based, and Relation-Based Knowledge Distillation. Chuanguang Yang, Xinqiang Yu, Zhulin An, Yongjun Xu. 19 Jun 2023. [VLM, OffRL]
Towards Medical Artificial General Intelligence via Knowledge-Enhanced Multimodal Pretraining. Bingqian Lin, Zicong Chen, Mingjie Li, Haokun Lin, Hang Xu, ..., Ling-Hao Chen, Xiaojun Chang, Yi Yang, L. Xing, Xiaodan Liang. 26 Apr 2023. [LM&MA, MedIm, AI4CE]
Function-Consistent Feature Distillation. Dongyang Liu, Meina Kan, Shiguang Shan, Xilin Chen. 24 Apr 2023.
A Survey on Approximate Edge AI for Energy Efficient Autonomous Driving Services. Dewant Katare, Diego Perino, J. Nurmi, M. Warnier, Marijn Janssen, Aaron Yi Ding. 13 Apr 2023.
A Survey on Recent Teacher-student Learning Studies. Min Gao. 10 Apr 2023.
DisWOT: Student Architecture Search for Distillation WithOut Training. Peijie Dong, Lujun Li, Zimian Wei. 28 Mar 2023.
Understanding the Role of the Projector in Knowledge Distillation. Roy Miles, K. Mikolajczyk. 20 Mar 2023.
Towards a Smaller Student: Capacity Dynamic Distillation for Efficient Image Retrieval. Yi Xie, Huaidong Zhang, Xuemiao Xu, Jianqing Zhu, Shengfeng He. 16 Mar 2023. [VLM]
TiG-BEV: Multi-view BEV 3D Object Detection via Target Inner-Geometry Learning. Pei-Kai Huang, L. Liu, Renrui Zhang, Song Zhang, Xin Xu, Bai-Qi Wang, G. Liu. 28 Dec 2022. [3DPC, MDE]
Understanding the Role of Mixup in Knowledge Distillation: An Empirical Study. Hongjun Choi, Eunyeong Jeon, Ankita Shukla, P. Turaga. 08 Nov 2022.
Learning Knowledge Representation with Meta Knowledge Distillation for Single Image Super-Resolution. Han Zhu, Zhenzhong Chen, Shan Liu. 18 Jul 2022. [SupR]
Normalized Feature Distillation for Semantic Segmentation. Tao Liu, Xi Yang, Chenshu Chen. 12 Jul 2022.
SERE: Exploring Feature Self-relation for Self-supervised Transformer. Zhong-Yu Li, Shanghua Gao, Ming-Ming Cheng. 10 Jun 2022. [ViT, MDE]
Cross-modal Clinical Graph Transformer for Ophthalmic Report Generation. Mingjie Li, Wenjia Cai, Karin Verspoor, Shirui Pan, Xiaodan Liang, Xiaojun Chang. 04 Jun 2022. [MedIm]
Knowledge Distillation via the Target-aware Transformer. Sihao Lin, Hongwei Xie, Bing Wang, Kaicheng Yu, Xiaojun Chang, Xiaodan Liang, G. Wang. 22 May 2022. [ViT]
TransKD: Transformer Knowledge Distillation for Efficient Semantic Segmentation. R. Liu, Kailun Yang, Alina Roitberg, Jiaming Zhang, Kunyu Peng, Huayao Liu, Yaonan Wang, Rainer Stiefelhagen. 27 Feb 2022. [ViT]
Knowledge Distillation with Deep Supervision. Shiya Luo, Defang Chen, Can Wang. 16 Feb 2022.
Data-Free Knowledge Transfer: A Survey. Yuang Liu, Wei Zhang, Jun Wang, Jianyong Wang. 31 Dec 2021.
Pixel Distillation: A New Knowledge Distillation Scheme for Low-Resolution Image Recognition. Guangyu Guo, Dingwen Zhang, Longfei Han, Nian Liu, Ming-Ming Cheng, Junwei Han. 17 Dec 2021.
Information Theoretic Representation Distillation. Roy Miles, Adrian Lopez-Rodriguez, K. Mikolajczyk. 01 Dec 2021. [MQ]
Optimizing for In-memory Deep Learning with Emerging Memory Technology. Zhehui Wang, Tao Luo, Rick Siow Mong Goh, Wei Zhang, Weng-Fai Wong. 01 Dec 2021.
Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching. Mingi Ji, Byeongho Heo, Sungrae Park. 05 Feb 2021.