arXiv: 1910.10699
Contrastive Representation Distillation
International Conference on Learning Representations (ICLR), 2019
23 October 2019
Yonglong Tian
Dilip Krishnan
Phillip Isola
Links: arXiv (abs) · PDF · HTML · GitHub (2336★)
Papers citing "Contrastive Representation Distillation" (showing 50 of 686)
Learning Unified Representations for Multi-Resolution Face Recognition
Hulingxiao He
Wu Yuan
Yidian Huang
Shilong Zhao
Wen Yuan
Hanqin Li
CVBM
123
0
0
14 Oct 2023
Towards the Fundamental Limits of Knowledge Transfer over Finite Domains
International Conference on Learning Representations (ICLR), 2023
Qingyue Zhao
Banghua Zhu
344
5
0
11 Oct 2023
Fast and Robust Early-Exiting Framework for Autoregressive Language Models with Synchronized Parallel Decoding
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2023
Sangmin Bae
Jongwoo Ko
Hwanjun Song
SeYoung Yun
190
76
0
09 Oct 2023
OpenIncrement: A Unified Framework for Open Set Recognition and Deep Class-Incremental Learning
Jiawen Xu
Claas Grohnfeldt
O. Kao
BDL
CLL
206
2
0
05 Oct 2023
LumiNet: Perception-Driven Knowledge Distillation via Statistical Logit Calibration
M. Hossain
M. M. L. Elahi
Sameera Ramasinghe
A. Cheraghian
Fuad Rahman
Nabeel Mohammed
Shafin Rahman
274
1
0
05 Oct 2023
Continual Contrastive Spoken Language Understanding
Annual Meeting of the Association for Computational Linguistics (ACL), 2023
Umberto Cappellazzo
Enrico Fini
Muqiao Yang
Daniele Falavigna
Alessio Brutti
Bhiksha Raj
CLL
251
1
0
04 Oct 2023
Improving Knowledge Distillation with Teacher's Explanation
S. Chowdhury
Ben Liang
A. Tizghadam
Ilijc Albanese
FAtt
108
1
0
04 Oct 2023
Heterogeneous Federated Learning Using Knowledge Codistillation
Jared Lichtarge
Ehsan Amid
Shankar Kumar
Tien-Ju Yang
Rohan Anil
Rajiv Mathews
FedML
195
0
0
04 Oct 2023
Generalizable Heterogeneous Federated Cross-Correlation and Instance Similarity Learning
IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI), 2023
Wenke Huang
J. J. Valero-Mas
Dasaem Jeong
Bo Du
FedML
168
72
0
28 Sep 2023
VideoAdviser: Video Knowledge Distillation for Multimodal Transfer Learning
IEEE Access (IEEE Access), 2023
Yanan Wang
Donghuo Zeng
Shinya Wada
Satoshi Kurihara
136
10
0
27 Sep 2023
Weight Averaging Improves Knowledge Distillation under Domain Shift
Valeriy Berezovskiy
Nikita Morozov
MoMe
204
2
0
20 Sep 2023
Heterogeneous Generative Knowledge Distillation with Masked Image Modeling
Ziming Wang
Shumin Han
Xiaodi Wang
Jing Hao
Xianbin Cao
Baochang Zhang
VLM
184
1
0
18 Sep 2023
Quality-Agnostic Deepfake Detection with Intra-model Collaborative Learning
IEEE International Conference on Computer Vision (ICCV), 2023
B. Le
Simon S. Woo
AAML
180
45
0
12 Sep 2023
3D Denoisers are Good 2D Teachers: Molecular Pretraining via Denoising and Cross-Modal Distillation
AAAI Conference on Artificial Intelligence (AAAI), 2023
Sungjun Cho
Dae-Woong Jeong
Sung Moon Ko
Jinwoo Kim
Sehui Han
Seunghoon Hong
Honglak Lee
Moontae Lee
AI4CE
DiffM
171
1
0
08 Sep 2023
Rethinking Momentum Knowledge Distillation in Online Continual Learning
International Conference on Machine Learning (ICML), 2023
Nicolas Michel
Maorong Wang
L. Xiao
T. Yamasaki
CLL
190
16
0
06 Sep 2023
Knowledge Distillation Layer that Lets the Student Decide
British Machine Vision Conference (BMVC), 2023
Ada Gorgun
Y. Z. Gürbüz
A. Aydin Alatan
159
0
0
06 Sep 2023
Code Representation Pre-training with Complements from Program Executions
Jiabo Huang
Jianyu Zhao
Yuyang Rong
Yiwen Guo
Yifeng He
Hao Chen
249
6
0
04 Sep 2023
MoMA: Momentum Contrastive Learning with Multi-head Attention-based Knowledge Distillation for Histopathology Image Analysis
T. Vuong
J. T. Kwak
179
10
0
31 Aug 2023
SynthDistill: Face Recognition with Knowledge Distillation from Synthetic Data
Hatef Otroshi
Anjith George
Sébastien Marcel
165
15
0
28 Aug 2023
Efficient View Synthesis with Neural Radiance Distribution Field
IEEE International Conference on Computer Vision (ICCV), 2023
Yushuang Wu
Xiao Li
Jinglu Wang
Xiaoguang Han
Shuguang Cui
Yan Lu
210
2
0
22 Aug 2023
Diffusion Model as Representation Learner
IEEE International Conference on Computer Vision (ICCV), 2023
Xingyi Yang
Xinchao Wang
DiffM
191
80
0
21 Aug 2023
GiGaMAE: Generalizable Graph Masked Autoencoder via Collaborative Latent Space Reconstruction
International Conference on Information and Knowledge Management (CIKM), 2023
Yucheng Shi
Yushun Dong
Qiaoyu Tan
Jundong Li
Ninghao Liu
277
37
0
18 Aug 2023
Unlimited Knowledge Distillation for Action Recognition in the Dark
Ruibing Jin
Guosheng Lin
Jie Lin
Zhengguo Li
Xiaoli Li
Zhenghua Chen
115
2
0
18 Aug 2023
CCFace: Classification Consistency for Low-Resolution Face Recognition
Mohammad Saeed Ebrahimi Saadabadi
Sahar Rahimi Malakshan
Hossein Kashiani
Nasser M. Nasrabadi
CVBM
145
4
0
18 Aug 2023
ZhiJian: A Unifying and Rapidly Deployable Toolbox for Pre-trained Model Reuse
Yi-Kai Zhang
Lu Ren
Chao Yi
Qiwen Wang
De-Chuan Zhan
Han-Jia Ye
119
3
0
17 Aug 2023
Learning to Distill Global Representation for Sparse-View CT
IEEE International Conference on Computer Vision (ICCV), 2023
Zilong Li
Chenglong Ma
Jie Chen
Junping Zhang
Hongming Shan
178
17
0
16 Aug 2023
Story Visualization by Online Text Augmentation with Context Memory
IEEE International Conference on Computer Vision (ICCV), 2023
Daechul Ahn
Daneul Kim
Gwangmo Song
Seung Wook Kim
Honglak Lee
Luan Tuyen Chau
Jonghyun Choi
DiffM
224
8
0
15 Aug 2023
Estimator Meets Equilibrium Perspective: A Rectified Straight Through Estimator for Binary Neural Networks Training
IEEE International Conference on Computer Vision (ICCV), 2023
Xiao-Ming Wu
Dian Zheng
Zuhao Liu
Weishi Zheng
MQ
255
25
0
13 Aug 2023
Multi-Label Knowledge Distillation
IEEE International Conference on Computer Vision (ICCV), 2023
Penghui Yang
Ming-Kun Xie
Chen-Chen Zong
Lei Feng
Gang Niu
Masashi Sugiyama
Sheng-Jun Huang
196
12
0
12 Aug 2023
Teacher-Student Architecture for Knowledge Distillation: A Survey
Chengming Hu
Xuan Li
Danyang Liu
Haolun Wu
Xi Chen
Ju Wang
Xue Liu
228
38
0
08 Aug 2023
NormKD: Normalized Logits for Knowledge Distillation
Zhihao Chi
Tu Zheng
Hengjia Li
Zheng Yang
Boxi Wu
Binbin Lin
Xiaofei He
228
22
0
01 Aug 2023
Sampling to Distill: Knowledge Transfer from Open-World Data
ACM Multimedia (ACM MM), 2023
Yuzheng Wang
Zhaoyu Chen
Jie M. Zhang
Dingkang Yang
Zuhao Ge
Yang Liu
Siao Liu
Yunquan Sun
Wenqiang Zhang
Lizhe Qi
143
13
0
31 Jul 2023
Class-relation Knowledge Distillation for Novel Class Discovery
IEEE International Conference on Computer Vision (ICCV), 2023
Peiyan Gu
Chuyu Zhang
Rui Xu
Xuming He
272
30
0
18 Jul 2023
Cumulative Spatial Knowledge Distillation for Vision Transformers
IEEE International Conference on Computer Vision (ICCV), 2023
Borui Zhao
Renjie Song
Jiajun Liang
155
21
0
17 Jul 2023
DOT: A Distillation-Oriented Trainer
IEEE International Conference on Computer Vision (ICCV), 2023
Borui Zhao
Quan Cui
Renjie Song
Jiajun Liang
137
11
0
17 Jul 2023
Regression-Oriented Knowledge Distillation for Lightweight Ship Orientation Angle Prediction with Optical Remote Sensing Images
Signal, Image and Video Processing (SIVP), 2023
Zhan Shi
Xin Ding
Peng Ding
Chun Yang
Ruihong Huang
Xiao-dong Song
107
1
0
13 Jul 2023
The Staged Knowledge Distillation in Video Classification: Harmonizing Student Progress by a Complementary Weakly Supervised Framework
Chao Wang
Zhenghang Tang
248
4
0
11 Jul 2023
Customizing Synthetic Data for Data-Free Student Learning
IEEE International Conference on Multimedia and Expo (ICME), 2023
Shiya Luo
Defang Chen
Can Wang
79
2
0
10 Jul 2023
Distilling Large Vision-Language Model with Out-of-Distribution Generalizability
IEEE International Conference on Computer Vision (ICCV), 2023
Xuanlin Li
Yunhao Fang
Minghua Liu
Z. Ling
Zhuowen Tu
Haoran Su
VLM
287
40
0
06 Jul 2023
Review helps learn better: Temporal Supervised Knowledge Distillation
Dongwei Wang
Zhi Han
Yanmei Wang
Xi’ai Chen
Baichen Liu
Yandong Tang
298
1
0
03 Jul 2023
A Dimensional Structure based Knowledge Distillation Method for Cross-Modal Learning
Hui Xiong
Hongwei Dong
Jingyao Wang
J. Yu
Wen-jie Zhai
Changwen Zheng
Jianwei Niu
Gang Hua
165
1
0
28 Jun 2023
Hybrid Distillation: Connecting Masked Autoencoders with Contrastive Learners
International Conference on Learning Representations (ICLR), 2023
Bowen Shi
Xiaopeng Zhang
Yaoming Wang
Jin Li
Wenrui Dai
Junni Zou
H. Xiong
Qi Tian
240
8
0
28 Jun 2023
Enhancing Mapless Trajectory Prediction through Knowledge Distillation
Yuning Wang
Pu Zhang
Mengwei He
Jianru Xue
132
7
0
25 Jun 2023
NetBooster: Empowering Tiny Deep Learning By Standing on the Shoulders of Deep Giants
Design Automation Conference (DAC), 2023
Zhongzhi Yu
Y. Fu
Jiayi Yuan
Haoran You
Yingyan Lin
189
2
0
23 Jun 2023
Categories of Response-Based, Feature-Based, and Relation-Based Knowledge Distillation
Chuanguang Yang
Xinqiang Yu
Zhulin An
Yongjun Xu
VLM
OffRL
476
33
0
19 Jun 2023
Enhanced Multimodal Representation Learning with Cross-modal KD
Computer Vision and Pattern Recognition (CVPR), 2023
Mengxi Chen
Linyu Xing
Yu Wang
Ya Zhang
129
17
0
13 Jun 2023
Adaptive Multi-Teacher Knowledge Distillation with Meta-Learning
IEEE International Conference on Multimedia and Expo (ICME), 2023
Hailin Zhang
Defang Chen
Can Wang
177
27
0
11 Jun 2023
CALICO: Self-Supervised Camera-LiDAR Contrastive Pre-training for BEV Perception
International Conference on Learning Representations (ICLR), 2023
Jiachen Sun
Haizhong Zheng
Qingzhao Zhang
Atul Prakash
Z. Morley Mao
Chaowei Xiao
SSL
216
12
0
01 Jun 2023
Improving Knowledge Distillation via Regularizing Feature Norm and Direction
Yuzhu Wang
Lechao Cheng
Manni Duan
Yongheng Wang
Zunlei Feng
Shu Kong
178
28
0
26 May 2023
Three Towers: Flexible Contrastive Learning with Pretrained Image Models
Neural Information Processing Systems (NeurIPS), 2023
Jannik Kossen
Mark Collier
Basil Mustafa
Tianlin Li
Xiaohua Zhai
Lucas Beyer
Andreas Steiner
Jesse Berent
Rodolphe Jenatton
Efi Kokiopoulou
VLM
173
18
0
26 May 2023