Contrastive Representation Distillation
arXiv: 1910.10699 (v3, latest)
International Conference on Learning Representations (ICLR), 2019
23 October 2019
Yonglong Tian, Dilip Krishnan, Phillip Isola
Links: arXiv (abs), PDF, HTML, GitHub (2336★)
Papers citing "Contrastive Representation Distillation" (50 of 686 shown):
- CAML: Collaborative Auxiliary Modality Learning for Multi-Agent Systems. Rui Liu, Yu-cui Shen, Peng Gao, Erfaun Noorani, Ming C. Lin. 25 Feb 2025.
- Multi-Level Decoupled Relational Distillation for Heterogeneous Architectures. Yaoxin Yang, Peng Ye, Weihao Lin, Kangcong Li, Yan Wen, Jia Hao, Tao Chen. 10 Feb 2025.
- Contrastive Representation Distillation via Multi-Scale Feature Decoupling. Cuipeng Wang, Tieyuan Chen. 09 Feb 2025.
- Rethinking Knowledge in Distillation: An In-context Sample Retrieval Perspective. Jinjing Zhu, Songze Li, Lin Wang. 13 Jan 2025.
- Knowledge Distillation with Adapted Weight. Sirong Wu, Junjie Liu, Xi Luo, Yuhui Deng. Statistics (Berlin) (SB), 2025. 06 Jan 2025.
- Predicting the Reliability of an Image Classifier under Image Distortion. D. Nguyen, Sunil Gupta, Kien Do, Svetha Venkatesh. 22 Dec 2024. [VLM]
- Cross-View Consistency Regularisation for Knowledge Distillation. W. Zhang, Dongnan Liu, Weidong Cai, Chao Ma. ACM Multimedia (MM), 2024. 21 Dec 2024.
- Neural Collapse Inspired Knowledge Distillation. Shuoxi Zhang, Zijian Song, Kun He. AAAI Conference on Artificial Intelligence (AAAI), 2024. 16 Dec 2024.
- Wasserstein Distance Rivals Kullback-Leibler Divergence for Knowledge Distillation. Jiaming Lv, Haoyuan Yang, P. Li. Neural Information Processing Systems (NeurIPS), 2024. 11 Dec 2024.
- Visual-Word Tokenizer: Beyond Fixed Sets of Tokens in Vision Transformers. Leonidas Gee, Wing Yan Li, V. Sharmanska, Novi Quadrianto. 23 Nov 2024. [ViT]
- Map-Free Trajectory Prediction with Map Distillation and Hierarchical Encoding. Xiaodong Liu, Yucheng Xing, Xin Wang. 17 Nov 2024.
- Dual-Head Knowledge Distillation: Enhancing Logits Utilization with an Auxiliary Head. Penghui Yang, Chen-Chen Zong, Sheng-Jun Huang, Lei Feng, Bo An. 13 Nov 2024.
- Quantifying Knowledge Distillation Using Partial Information Decomposition. Pasan Dissanayake, Faisal Hamman, Barproda Halder, Ilia Sucholutsky, Qiuyi Zhang, Sanghamitra Dutta. International Conference on Artificial Intelligence and Statistics (AISTATS), 2024. 12 Nov 2024.
- Over-parameterized Student Model via Tensor Decomposition Boosted Knowledge Distillation. Yu-Liang Zhan, Zhong-Yi Lu, Hao Sun, Ze-Feng Gao. Neural Information Processing Systems (NeurIPS), 2024. 10 Nov 2024.
- GazeGen: Gaze-Driven User Interaction for Visual Content Generation. He-Yen Hsieh, Ziyun Li, Sai Qian Zhang, W. Ting, Kao-Den Chang, B. D. Salvo, Chiao Liu, H. T. Kung. 07 Nov 2024. [VGen]
- Toward Robust Incomplete Multimodal Sentiment Analysis via Hierarchical Representation Learning. Mingxing Li, Jinjie Wei, Yongxu Liu, Shunli Wang, Jiawei Chen, ..., Xiaolu Hou, Mingyang Sun, Ziyun Qian, Dongliang Kou, Li Zhang. Neural Information Processing Systems (NeurIPS), 2024. 05 Nov 2024.
- Transferable polychromatic optical encoder for neural networks. Minho Choi, Jinlin Xiang, A. Wirth-Singh, Seung-Hwan Baek, Eli Shlizerman, Anirudha Majumdar. Nature Communications (Nat. Commun.), 2024. 05 Nov 2024.
- Decoupling Dark Knowledge via Block-wise Logit Distillation for Feature-level Alignment. Chengting Yu, Fengzhao Zhang, Ruizhe Chen, Zuozhu Liu, Shurun Tan, Er-ping Li, Aili Wang. IEEE Transactions on Artificial Intelligence (IEEE TAI), 2024. 03 Nov 2024.
- Pre-training Distillation for Large Language Models: A Design Space Exploration. Hao Peng, Xin Lv, Yushi Bai, Zijun Yao, Jing Zhang, Lei Hou, Juanzi Li. Annual Meeting of the Association for Computational Linguistics (ACL), 2024. 21 Oct 2024.
- Preview-based Category Contrastive Learning for Knowledge Distillation. Muhe Ding, Yue Yu, Xue Dong, Xiaojie Li, Pengda Qin, Tian Gan, Liqiang Nie. 18 Oct 2024. [VLM]
- CAKD: A Correlation-Aware Knowledge Distillation Framework Based on Decoupling Kullback-Leibler Divergence. Zao Zhang, Huaming Chen, Pei Ning, Nan Yang, Dong Yuan. Industrial Conference on Data Mining (IDM), 2024. 17 Oct 2024.
- Composing Novel Classes: A Concept-Driven Approach to Generalized Category Discovery. Chuyu Zhang, Peiyan Gu, Xueyang Yu, Xuming He. 17 Oct 2024.
- Fuse Before Transfer: Knowledge Fusion for Heterogeneous Distillation. Guopeng Li, Qiang Wang, K. Yan, Shouhong Ding, Yuan Gao, Gui-Song Xia. 16 Oct 2024.
- HASN: Hybrid Attention Separable Network for Efficient Image Super-resolution. Weifeng Cao, Xiaoyan Lei, Jun Shi, Wanyong Liang, Jie Liu, Zongfei Bai. The Visual Computer (VC), 2024. 13 Oct 2024. [SupR]
- Efficient and Robust Knowledge Distillation from A Stronger Teacher Based on Correlation Matching. Wenqi Niu, Yingchao Wang, Guohui Cai, Hanpo Hou. 09 Oct 2024.
- JPEG Inspired Deep Learning. Ahmed H. Salamah, Kaixiang Zheng, Yiwen Liu, En-Hui Yang. International Conference on Learning Representations (ICLR), 2024. 09 Oct 2024.
- Gap Preserving Distillation by Building Bidirectional Mappings with A Dynamic Teacher. Yong Guo, Shulian Zhang, Haolin Pan, Jing Liu, Yulun Zhang, Jian Chen. International Conference on Learning Representations (ICLR), 2024. 05 Oct 2024.
- Linear Projections of Teacher Embeddings for Few-Class Distillation. Noel Loo, Fotis Iliopoulos, Wei Hu, Erik Vee. 30 Sep 2024.
- Classroom-Inspired Multi-Mentor Distillation with Adaptive Learning Strategies. Shalini Sarode, Muhammad Saif Ullah Khan, Tahira Shehzadi, Didier Stricker, Muhammad Zeshan Afzal. 30 Sep 2024.
- IDEA: An Inverse Domain Expert Adaptation Based Active DNN IP Protection Method. Chaohui Xu, Qi Cui, Jinxin Dong, Weiyang He, Chip-Hong Chang. 29 Sep 2024. [AAML]
- Harmonizing knowledge Transfer in Neural Network with Unified Distillation. Yaomin Huang, Zaomin Yan, Yaxin Peng, Faming Fang, Guixu Zhang. European Conference on Computer Vision (ECCV), 2024. 27 Sep 2024.
- Cascade Prompt Learning for Vision-Language Model Adaptation. Ge Wu, Xin Zhang, Zheng Li, Zhaowei Chen, Jiajun Liang, Jian Yang, Xiang Li. European Conference on Computer Vision (ECCV), 2024. 26 Sep 2024. [VLM]
- Shape-intensity knowledge distillation for robust medical image segmentation. Wenhui Dong, Bo Du, Yongchao Xu. 26 Sep 2024.
- Enhancing Logits Distillation with Plug&Play Kendall's τ Ranking Loss. Yuchen Guan, Runxi Cheng, Kang Liu, Chun Yuan. 26 Sep 2024.
- Simple Unsupervised Knowledge Distillation With Space Similarity. Aditya Singh, Haohan Wang. European Conference on Computer Vision (ECCV), 2024. 20 Sep 2024.
- Exploring and Enhancing the Transfer of Distribution in Knowledge Distillation for Autoregressive Language Models. Jun Rao, Xuebo Liu, Zepeng Lin, Liang Ding, Jing Li, Dacheng Tao, Min Zhang. 19 Sep 2024.
- Frequency-Guided Masking for Enhanced Vision Self-Supervised Learning. Amin Karimi Monsefi, Mengxi Zhou, Nastaran Karimi Monsefi, Ser-Nam Lim, Wei-Lun Chao, R. Ramnath. International Conference on Learning Representations (ICLR), 2024. 16 Sep 2024.
- Integrated Multi-Level Knowledge Distillation for Enhanced Speaker Verification. Wenhao Yang, Jianguo Wei, Wenhuan Lu, Xugang Lu, Lei Li. 14 Sep 2024.
- Low-Resolution Object Recognition with Cross-Resolution Relational Contrastive Distillation. Kangkai Zhang, Shiming Ge, Ruixin Shi, Dan Zeng. 04 Sep 2024.
- Learning Privacy-Preserving Student Networks via Discriminative-Generative Distillation. Shiming Ge, Bochao Liu, Pengju Wang, Yong Li, Dan Zeng. IEEE Transactions on Image Processing (IEEE TIP), 2022. 04 Sep 2024. [FedML]
- Low-Resolution Face Recognition via Adaptable Instance-Relation Distillation. Ruixin Shi, Weijia Guo, Shiming Ge. IEEE International Joint Conference on Neural Network (IJCNN), 2024. 03 Sep 2024. [CVBM]
- Adaptive Explicit Knowledge Transfer for Knowledge Distillation. H. Park, Jong-seok Lee. 03 Sep 2024.
- Image-to-Lidar Relational Distillation for Autonomous Driving Data. Anas Mahmoud, Ali Harakeh, Steven Waslander. European Conference on Computer Vision (ECCV), 2024. 01 Sep 2024.
- PRG: Prompt-Based Distillation Without Annotation via Proxy Relational Graph. Yijin Xu, Jialun Liu, Hualiang Wei, Wenhui Li. 22 Aug 2024.
- LAKD-Activation Mapping Distillation Based on Local Learning. Yaoze Zhang, Yuming Zhang, Yu Zhao, Yue Zhang, Feiyu Zhu. 21 Aug 2024.
- Focus on Focus: Focus-oriented Representation Learning and Multi-view Cross-modal Alignment for Glioma Grading. Li Pan, Yupei Zhang, Qiushi Yang, Tan Li, Xiaohan Xing, Maximus C. F. Yeung, Zhen Chen. IEEE International Conference on Bioinformatics and Biomedicine (BIBM), 2024. 16 Aug 2024.
- Knowledge Distillation with Refined Logits. Wujie Sun, Defang Chen, Siwei Lyu, Genlang Chen, Chun-Yen Chen, Can Wang. 14 Aug 2024.
- Depth Helps: Improving Pre-trained RGB-based Policy with Depth Information Injection. Xincheng Pang, Wenke Xia, Zhigang Wang, Bin Zhao, Di Hu, Dong Wang, Xuelong Li. IEEE/RJS International Conference on Intelligent RObots and Systems (IROS), 2024. 09 Aug 2024.
- UNIC: Universal Classification Models via Multi-teacher Distillation. Mert Bulent Sariyildiz, Philippe Weinzaepfel, Thomas Lucas, Diane Larlus, Yannis Kalantidis. European Conference on Computer Vision (ECCV), 2024. 09 Aug 2024.
- How to Train the Teacher Model for Effective Knowledge Distillation. Shayan Mohajer Hamidi, Xizhen Deng, Renhao Tan, Linfeng Ye, Ahmed H. Salamah. 25 Jul 2024.