arXiv:1910.10699 (v3)
Contrastive Representation Distillation
International Conference on Learning Representations (ICLR), 2019 · 23 October 2019
Yonglong Tian, Dilip Krishnan, Phillip Isola
ArXiv (abs) · PDF · HTML · GitHub (2336★)

Papers citing "Contrastive Representation Distillation" (50 of 686 shown)

Triplet Knowledge Distillation
Xijun Wang, Dongyang Liu, Meina Kan, Chunrui Han, Zhongqin Wu, Shiguang Shan
25 May 2023 · 114 · 3 · 0

VanillaKD: Revisit the Power of Vanilla Knowledge Distillation from Small Scale to Large Scale
Zhiwei Hao, Jianyuan Guo, Kai Han, Han Hu, Chang Xu, Yunhe Wang
25 May 2023 · 138 · 19 · 0

On the Impact of Knowledge Distillation for Model Interpretability
International Conference on Machine Learning (ICML), 2023
Hyeongrok Han, Siwon Kim, Hyun-Soo Choi, Sungroh Yoon
25 May 2023 · 163 · 11 · 0

Knowledge Diffusion for Distillation
Neural Information Processing Systems (NeurIPS), 2023
Tao Huang, Yuan Zhang, Mingkai Zheng, Shan You, Fei Wang, Chao Qian, Chang Xu
25 May 2023 · 243 · 83 · 0

Decoupled Kullback-Leibler Divergence Loss
Neural Information Processing Systems (NeurIPS), 2023
Jiequan Cui, Zhuotao Tian, Zhisheng Zhong, Xiaojuan Qi, Bei Yu, Hanwang Zhang
23 May 2023 · 201 · 66 · 0

NORM: Knowledge Distillation via N-to-One Representation Matching
International Conference on Learning Representations (ICLR), 2023
Xiaolong Liu, Lujun Li, Chao Li, Anbang Yao
23 May 2023 · 171 · 89 · 0

Revisiting Data Augmentation in Model Compression: An Empirical and Comprehensive Study
IEEE International Joint Conference on Neural Networks (IJCNN), 2023
Muzhou Yu, Linfeng Zhang, Kaisheng Ma
22 May 2023 · 140 · 2 · 0

Is Synthetic Data From Diffusion Models Ready for Knowledge Distillation?
Zheng Li, Yuxuan Li, Penghai Zhao, Renjie Song, Xiang Li, Jian Yang
22 May 2023 · 161 · 24 · 0

Student-friendly Knowledge Distillation
Knowledge-Based Systems (KBS), 2023
Mengyang Yuan, Bo Lang, Fengnan Quan
18 May 2023 · 198 · 28 · 0

DynamicKD: An Effective Knowledge Distillation via Dynamic Entropy Correction-Based Distillation for Gap Optimizing
Pattern Recognition (Pattern Recogn.), 2023
Songling Zhu, Ronghua Shang, Bo Yuan, Weitong Zhang, Yangyang Li, Licheng Jiao
09 May 2023 · 115 · 9 · 0

Do Not Blindly Imitate the Teacher: Using Perturbed Loss for Knowledge Distillation
Rongzhi Zhang, Jiaming Shen, Tianqi Liu, Jia-Ling Liu, Michael Bendersky, Marc Najork, Chao Zhang
08 May 2023 · 212 · 28 · 0

Avatar Knowledge Distillation: Self-ensemble Teacher Paradigm with Uncertainty
ACM Multimedia (ACM MM), 2023
Yuan Zhang, Weihua Chen, Yichen Lu, Tao Huang, Xiuyu Sun, Jian Cao
04 May 2023 · 318 · 11 · 0

MolKD: Distilling Cross-Modal Knowledge in Chemical Reactions for Molecular Property Prediction
Liang Zeng, Lanqing Li, Jian Li
03 May 2023 · 137 · 4 · 0

On Uni-Modal Feature Learning in Supervised Multi-Modal Learning
International Conference on Machine Learning (ICML), 2023
Chenzhuang Du, Jiaye Teng, Tingle Li, Yichen Liu, Tianyuan Yuan, Yue Wang, Yang Yuan, Hang Zhao
02 May 2023 · 318 · 70 · 0

Long-Tailed Recognition by Mutual Information Maximization between Latent Features and Ground-Truth Labels
International Conference on Machine Learning (ICML), 2023
Min-Kook Suh, Seung-Woo Seo [SSL]
02 May 2023 · 279 · 27 · 0

File Fragment Classification using Light-Weight Convolutional Neural Networks
IEEE Access, 2023
Mustafa Ghaleb, K. Saaim, Muhamad Felemban, S. Al-Saleh, Ahmad S. Al-Mulhem
01 May 2023 · 151 · 6 · 0

Class Attention Transfer Based Knowledge Distillation
Computer Vision and Pattern Recognition (CVPR), 2023
Ziyao Guo, Haonan Yan, Hui Li, Xiao-La Lin
25 Apr 2023 · 133 · 101 · 0

Improving Knowledge Distillation via Transferring Learning Ability
Long Liu, Tong Li, Hui Cheng
24 Apr 2023 · 124 · 1 · 0

Function-Consistent Feature Distillation
International Conference on Learning Representations (ICLR), 2023
Dongyang Liu, Meina Kan, Shiguang Shan, Xilin Chen
24 Apr 2023 · 216 · 25 · 0

eTag: Class-Incremental Learning with Embedding Distillation and Task-Oriented Generation
Libo Huang, Yan Zeng, Chuanguang Yang, Zhulin An, Boyu Diao, Yongjun Xu [CLL]
20 Apr 2023 · 147 · 3 · 0

Knowledge Distillation Under Ideal Joint Classifier Assumption
Neural Networks (Neural Netw.), 2023
Huayu Li, Xiwen Chen, G. Ditzler, Janet Roveda, Ao Li
19 Apr 2023 · 128 · 2 · 0

Deep Collective Knowledge Distillation
Jihyeon Seo, Kyusam Oh, Chanho Min, Yongkeun Yun, Sungwoo Cho
18 Apr 2023 · 70 · 0 · 0

Robust Cross-Modal Knowledge Distillation for Unconstrained Videos
Wenke Xia, Xingjian Li, Andong Deng, Haoyi Xiong, Dejing Dou, Di Hu
16 Apr 2023 · 114 · 7 · 0

Teacher Network Calibration Improves Cross-Quality Knowledge Distillation
Pia Cuk, Robin Senge, M. Lauri, Simone Frintrop
15 Apr 2023 · 175 · 2 · 0

Multi-Mode Online Knowledge Distillation for Self-Supervised Visual Representation Learning
Computer Vision and Pattern Recognition (CVPR), 2023
Kaiyou Song, Jin Xie, Shanyi Zhang, Zimeng Luo
13 Apr 2023 · 240 · 36 · 0

Homogenizing Non-IID datasets via In-Distribution Knowledge Distillation for Decentralized Learning
Deepak Ravikumar, Gobinda Saha, Sai Aparna Aketi, Kaushik Roy
09 Apr 2023 · 204 · 5 · 0

Towards Efficient Task-Driven Model Reprogramming with Foundation Models
Shoukai Xu, Jiangchao Yao, Ran Luo, Shuhai Zhang, Zihao Lian, Zhuliang Yu, Bo Han, Yaowei Wang
05 Apr 2023 · 199 · 6 · 0

Long-Tailed Visual Recognition via Self-Heterogeneous Integration with Knowledge Excavation
Computer Vision and Pattern Recognition (CVPR), 2023
Yang Jin, Mengke Li, Yang Lu, Yiu-ming Cheung, Hanzi Wang
03 Apr 2023 · 201 · 46 · 0

Decomposed Cross-modal Distillation for RGB-based Temporal Action Detection
Computer Vision and Pattern Recognition (CVPR), 2023
Pilhyeon Lee, Taeoh Kim, Minho Shim, Dongyoon Wee, H. Byun
30 Mar 2023 · 184 · 13 · 0

Information-Theoretic GAN Compression with Variational Energy-based Model
Neural Information Processing Systems (NeurIPS), 2023
Minsoo Kang, Hyewon Yoo, Eunhee Kang, Sehwan Ki, Hyong-Euk Lee, Bohyung Han [GAN]
28 Mar 2023 · 165 · 4 · 0

Head3D: Complete 3D Head Generation via Tri-plane Feature Distillation
Y. Cheng, Manwen Liao, Wenhan Zhu, Ye Pan, Bowen Pan, Yunbo Wang [3DH]
28 Mar 2023 · 150 · 10 · 0

DisWOT: Student Architecture Search for Distillation WithOut Training
Computer Vision and Pattern Recognition (CVPR), 2023
Peijie Dong, Lujun Li, Zimian Wei
28 Mar 2023 · 143 · 77 · 0

Prototype-Sample Relation Distillation: Towards Replay-Free Continual Learning
International Conference on Machine Learning (ICML), 2023
Nader Asadi, Mohammad Davar, Sudhir Mudur, Rahaf Aljundi, Eugene Belilovsky [CLL]
26 Mar 2023 · 154 · 55 · 0

Disentangling Writer and Character Styles for Handwriting Generation
Computer Vision and Pattern Recognition (CVPR), 2023
Gang Dai, Yifan Zhang, Qingfeng Wang, Qing Du, Zhu Liang Yu, Zhuoman Liu, Shuangping Huang
26 Mar 2023 · 217 · 43 · 0

A Simple and Generic Framework for Feature Distillation via Channel-wise Transformation
Ziwei Liu, Yongtao Wang, Xiaojie Chu
23 Mar 2023 · 240 · 11 · 0

From Knowledge Distillation to Self-Knowledge Distillation: A Unified Approach with Normalized Loss and Customized Soft Labels
IEEE International Conference on Computer Vision (ICCV), 2023
Zhendong Yang, Ailing Zeng, Zhe Li, Tianke Zhang, Chun Yuan, Yu Li
23 Mar 2023 · 244 · 112 · 0

FeatureNeRF: Learning Generalizable NeRFs by Distilling Foundation Models
IEEE International Conference on Computer Vision (ICCV), 2023
Jianglong Ye, Naiyan Wang, Xinyu Wang [DiffM]
22 Mar 2023 · 199 · 51 · 0

Understanding the Role of the Projector in Knowledge Distillation
AAAI Conference on Artificial Intelligence (AAAI), 2023
Roy Miles, K. Mikolajczyk
20 Mar 2023 · 232 · 44 · 0

Towards a Smaller Student: Capacity Dynamic Distillation for Efficient Image Retrieval
Computer Vision and Pattern Recognition (CVPR), 2023
Yi Xie, Huaidong Zhang, Xuemiao Xu, Jianqing Zhu, Shengfeng He [VLM]
16 Mar 2023 · 223 · 18 · 0

Knowledge Distillation from Single to Multi Labels: an Empirical Study
Youcai Zhang, Yuzhuo Qin, Heng-Ye Liu, Yanhao Zhang, Yaqian Li, X. Gu [VLM]
15 Mar 2023 · 160 · 2 · 0

MetaMixer: A Regularization Strategy for Online Knowledge Distillation
Maorong Wang, L. Xiao, T. Yamasaki [KELM, MoE]
14 Mar 2023 · 111 · 1 · 0

MobileVOS: Real-Time Video Object Segmentation Contrastive Learning meets Knowledge Distillation
Computer Vision and Pattern Recognition (CVPR), 2023
Roy Miles, M. K. Yucel, Bruno Manganelli, Albert Saà-Garriga [VOS]
14 Mar 2023 · 155 · 33 · 0

Data-Free Sketch-Based Image Retrieval
Computer Vision and Pattern Recognition (CVPR), 2023
Abhra Chaudhuri, A. Bhunia, Yi-Zhe Song, Anjan Dutta
14 Mar 2023 · 200 · 12 · 0

A Contrastive Knowledge Transfer Framework for Model Compression and Transfer Learning
IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), 2023
Kaiqi Zhao, Yitao Chen, Ming Zhao [VLM]
14 Mar 2023 · 143 · 2 · 0

Learn More for Food Recognition via Progressive Self-Distillation
AAAI Conference on Artificial Intelligence (AAAI), 2023
Yaohui Zhu, Linhu Liu, Jiang Tian
09 Mar 2023 · 123 · 8 · 0

Leveraging Angular Distributions for Improved Knowledge Distillation
Eunyeong Jeon, Hongjun Choi, Ankita Shukla, Pavan Turaga
27 Feb 2023 · 111 · 8 · 0

LightTS: Lightweight Time Series Classification with Adaptive Ensemble Distillation -- Extended Version
David Campos, Miao Zhang, B. Yang, Tung Kieu, Chenjuan Guo, Christian S. Jensen [AI4TS]
24 Feb 2023 · 182 · 87 · 0

Distilling Calibrated Student from an Uncalibrated Teacher
Ishan Mishra, Sethu Vamsi Krishna, Deepak Mishra [FedML]
22 Feb 2023 · 132 · 3 · 0

URCDC-Depth: Uncertainty Rectified Cross-Distillation with CutFlip for Monocular Depth Estimation
IEEE Transactions on Multimedia (IEEE TMM), 2023
Shuwei Shao, Z. Pei, Weihai Chen, Ran Li, Zhong Liu, Zhengguo Li [ViT, UQCV]
16 Feb 2023 · 208 · 43 · 0

SLaM: Student-Label Mixing for Distillation with Unlabeled Examples
Neural Information Processing Systems (NeurIPS), 2023
Vasilis Kontonis, Fotis Iliopoulos, Khoa Trinh, Cenk Baykal, Gaurav Menghani, Erik Vee
08 Feb 2023 · 159 · 9 · 0