Contrastive Representation Distillation
Yonglong Tian, Dilip Krishnan, Phillip Isola
23 October 2019 · arXiv:1910.10699

Papers citing "Contrastive Representation Distillation"

Showing 50 of 611 citing papers.
ABKD: Pursuing a Proper Allocation of the Probability Mass in Knowledge Distillation via $α$-$β$-Divergence
Guanghui Wang, Zhiyong Yang, Z. Wang, Shi Wang, Qianqian Xu, Q. Huang
07 May 2025

Swapped Logit Distillation via Bi-level Teacher Alignment
Stephen Ekaputra Limantoro, Jhe-Hao Lin, Chih-Yu Wang, Yi-Lung Tsai, Hong-Han Shuai, Ching-Chun Huang, Wen-Huang Cheng
27 Apr 2025

CMCRD: Cross-Modal Contrastive Representation Distillation for Emotion Recognition
Siyuan Kan, Huanyu Wu, Zhenyao Cui, Fan Huang, Xiaolong Xu, Dongrui Wu
12 Apr 2025

An Efficient Training Algorithm for Models with Block-wise Sparsity
Ding Zhu, Zhiqun Zuo, Mohammad Mahdi Khalili
27 Mar 2025

Sparse Logit Sampling: Accelerating Knowledge Distillation in LLMs
Anshumann, Mohd Abbas Zaidi, Akhil Kedia, Jinwoo Ahn, Taehwak Kwon, Kangwook Lee, Haejun Lee, Joohyung Lee
21 Mar 2025 · FedML

Cyclic Contrastive Knowledge Transfer for Open-Vocabulary Object Detection
Chuhan Zhang, Chaoyang Zhu, Pingcheng Dong, Long Chen, Dong Zhang
14 Mar 2025 · ObjD, VLM

CalliReader: Contextualizing Chinese Calligraphy via an Embedding-Aligned Vision-Language Model
Yuxuan Luo, Jiaqi Tang, Chenyi Huang, Feiyang Hao, Zhouhui Lian
13 Mar 2025 · VLM

Asymmetric Decision-Making in Online Knowledge Distillation: Unifying Consensus and Divergence
Zhaowei Chen, Borui Zhao, Yuchen Ge, Yuhao Chen, Renjie Song, Jiajun Liang
09 Mar 2025

AugFL: Augmenting Federated Learning with Pretrained Models
Sheng Yue, Zerui Qin, Yongheng Deng, Ju Ren, Yaoxue Zhang, Junshan Zhang
04 Mar 2025 · FedML

Multi-Level Decoupled Relational Distillation for Heterogeneous Architectures
Yaoxin Yang, Peng Ye, Weihao Lin, Kangcong Li, Yan Wen, Jia Hao, Tao Chen
10 Feb 2025

Contrastive Representation Distillation via Multi-Scale Feature Decoupling
Cuipeng Wang, Tieyuan Chen, Haipeng Wang
09 Feb 2025

Rethinking Knowledge in Distillation: An In-context Sample Retrieval Perspective
Jinjing Zhu, Songze Li, Lin Wang
13 Jan 2025

Knowledge Distillation with Adapted Weight
Sirong Wu, Xi Luo, Junjie Liu, Yuhui Deng
06 Jan 2025

Predicting the Reliability of an Image Classifier under Image Distortion
D. Nguyen, Sunil Gupta, Kien Do, Svetha Venkatesh
22 Dec 2024 · VLM

Cross-View Consistency Regularisation for Knowledge Distillation
W. Zhang, Dongnan Liu, Weidong Cai, Chao Ma
21 Dec 2024

Neural Collapse Inspired Knowledge Distillation
Shuoxi Zhang, Zijian Song, Kun He
16 Dec 2024

Wasserstein Distance Rivals Kullback-Leibler Divergence for Knowledge Distillation
Jiaming Lv, Haoyuan Yang, P. Li
11 Dec 2024

Map-Free Trajectory Prediction with Map Distillation and Hierarchical Encoding
Xiaodong Liu, Yucheng Xing, Xin Wang
17 Nov 2024

Dual-Head Knowledge Distillation: Enhancing Logits Utilization with an Auxiliary Head
Penghui Yang, Chen-Chen Zong, Sheng-Jun Huang, Lei Feng, Bo An
13 Nov 2024

Quantifying Knowledge Distillation Using Partial Information Decomposition
Pasan Dissanayake, Faisal Hamman, Barproda Halder, Ilia Sucholutsky, Qiuyi Zhang, Sanghamitra Dutta
12 Nov 2024

Over-parameterized Student Model via Tensor Decomposition Boosted Knowledge Distillation
Yu-Liang Zhan, Zhong-Yi Lu, Hao-Lun Sun, Ze-Feng Gao
10 Nov 2024

GazeGen: Gaze-Driven User Interaction for Visual Content Generation
He-Yen Hsieh, Ziyun Li, Sai Qian Zhang, W. Ting, Kao-Den Chang, B. D. Salvo, Chiao Liu, H. T. Kung
07 Nov 2024 · VGen

Toward Robust Incomplete Multimodal Sentiment Analysis via Hierarchical Representation Learning
M. Li, Dingkang Yang, Y. Liu, Shunli Wang, Jiawei Chen, ..., Xiaolu Hou, Mingyang Sun, Ziyun Qian, Dongliang Kou, L. Zhang
05 Nov 2024

Decoupling Dark Knowledge via Block-wise Logit Distillation for Feature-level Alignment
Chengting Yu, Fengzhao Zhang, Ruizhe Chen, Zuozhu Liu, Shurun Tan, Er-ping Li, Aili Wang
03 Nov 2024

Pre-training Distillation for Large Language Models: A Design Space Exploration
Hao Peng, Xin Lv, Yushi Bai, Zijun Yao, J. Zhang, Lei Hou, Juanzi Li
21 Oct 2024

Preview-based Category Contrastive Learning for Knowledge Distillation
Muhe Ding, Jianlong Wu, Xue Dong, Xiaojie Li, Pengda Qin, Tian Gan, Liqiang Nie
18 Oct 2024 · VLM

CAKD: A Correlation-Aware Knowledge Distillation Framework Based on Decoupling Kullback-Leibler Divergence
Zao Zhang, Huaming Chen, Pei Ning, Nan Yang, Dong Yuan
17 Oct 2024

Composing Novel Classes: A Concept-Driven Approach to Generalized Category Discovery
Chuyu Zhang, Peiyan Gu, Xueyang Yu, Xuming He
17 Oct 2024

TAS: Distilling Arbitrary Teacher and Student via a Hybrid Assistant
Guopeng Li, Qiang Wang, K. Yan, Shouhong Ding, Yuan Gao, Gui-Song Xia
16 Oct 2024

HASN: Hybrid Attention Separable Network for Efficient Image Super-resolution
Weifeng Cao, Xiaoyan Lei, Jun Shi, Wanyong Liang, Jie Liu, Zongfei Bai
13 Oct 2024 · SupR

Efficient and Robust Knowledge Distillation from A Stronger Teacher Based on Correlation Matching
Wenqi Niu, Yingchao Wang, Guohui Cai, Hanpo Hou
09 Oct 2024

JPEG Inspired Deep Learning
Ahmed H. Salamah, Kaixiang Zheng, Yiwen Liu, E. Yang
09 Oct 2024

Gap Preserving Distillation by Building Bidirectional Mappings with A Dynamic Teacher
Yong Guo, Shulian Zhang, Haolin Pan, Jing Liu, Yulun Zhang, Jian Chen
05 Oct 2024

Linear Projections of Teacher Embeddings for Few-Class Distillation
Noel Loo, Fotis Iliopoulos, Wei Hu, Erik Vee
30 Sep 2024

Classroom-Inspired Multi-Mentor Distillation with Adaptive Learning Strategies
Shalini Sarode, Muhammad Saif Ullah Khan, Tahira Shehzadi, Didier Stricker, Muhammad Zeshan Afzal
30 Sep 2024

IDEA: An Inverse Domain Expert Adaptation Based Active DNN IP Protection Method
Chaohui Xu, Qi Cui, Jinxin Dong, Weiyang He, Chip-Hong Chang
29 Sep 2024 · AAML

Harmonizing knowledge Transfer in Neural Network with Unified Distillation
Yaomin Huang, Zaomin Yan, Chaomin Shen, Faming Fang, Guixu Zhang
27 Sep 2024

Kendall's $τ$ Coefficient for Logits Distillation
Yuchen Guan, Runxi Cheng, Kang Liu, Chun Yuan
26 Sep 2024

Cascade Prompt Learning for Vision-Language Model Adaptation
Ge Wu, Xin Zhang, Zheng Li, Zhaowei Chen, Jiajun Liang, Jian Yang, Xiang Li
26 Sep 2024 · VLM

Shape-intensity knowledge distillation for robust medical image segmentation
Wenhui Dong, Bo Du, Yongchao Xu
26 Sep 2024

Simple Unsupervised Knowledge Distillation With Space Similarity
Aditya Singh, Haohan Wang
20 Sep 2024

Exploring and Enhancing the Transfer of Distribution in Knowledge Distillation for Autoregressive Language Models
Jun Rao, Xuebo Liu, Zepeng Lin, Liang Ding, Jing Li, Dacheng Tao, Min Zhang
19 Sep 2024

Frequency-Guided Masking for Enhanced Vision Self-Supervised Learning
Amin Karimi Monsefi, Mengxi Zhou, Nastaran Karimi Monsefi, Ser-Nam Lim, Wei-Lun Chao, R. Ramnath
16 Sep 2024

Integrated Multi-Level Knowledge Distillation for Enhanced Speaker Verification
Wenhao Yang, Jianguo Wei, Wenhuan Lu, Xugang Lu, Lei Li
14 Sep 2024

Low-Resolution Object Recognition with Cross-Resolution Relational Contrastive Distillation
Kangkai Zhang, Shiming Ge, Ruixin Shi, Dan Zeng
04 Sep 2024

Learning Privacy-Preserving Student Networks via Discriminative-Generative Distillation
Shiming Ge, Bochao Liu, Pengju Wang, Yong Li, Dan Zeng
04 Sep 2024 · FedML

Low-Resolution Face Recognition via Adaptable Instance-Relation Distillation
Ruixin Shi, Weijia Guo, Shiming Ge
03 Sep 2024 · CVBM

Adaptive Explicit Knowledge Transfer for Knowledge Distillation
H. Park, Jong-seok Lee
03 Sep 2024

Image-to-Lidar Relational Distillation for Autonomous Driving Data
Anas Mahmoud, Ali Harakeh, Steven Waslander
01 Sep 2024

PRG: Prompt-Based Distillation Without Annotation via Proxy Relational Graph
Yijin Xu, Jialun Liu, Hualiang Wei, Wenhui Li
22 Aug 2024