Contrastive Representation Distillation

International Conference on Learning Representations (ICLR), 2020
23 October 2019
Yonglong Tian
Dilip Krishnan
Phillip Isola
ArXiv (abs) · PDF · HTML · GitHub (2336★)

Papers citing "Contrastive Representation Distillation"

50 / 686 papers shown
Contrastive Attraction and Contrastive Repulsion for Representation Learning
Huangjie Zheng
Xu Chen
Jiangchao Yao
Hongxia Yang
Chunyuan Li
Ya Zhang
Hao Zhang
Ivor Tsang
Jingren Zhou
Mingyuan Zhou
SSL
200
13
0
08 May 2021
Initialization and Regularization of Factorized Neural Layers
International Conference on Learning Representations (ICLR), 2021
M. Khodak
Neil A. Tenenholtz
Lester W. Mackey
Nicolò Fusi
342
66
0
03 May 2021
Spirit Distillation: A Model Compression Method with Multi-domain Knowledge Transfer
Knowledge Science, Engineering and Management (KSEM), 2021
Zhiyuan Wu
Yu-Gang Jiang
Minghao Zhao
Chupeng Cui
Zongmin Yang
Xinhui Xue
Hong Qi
VLM
140
10
0
29 Apr 2021
LIDAR and Position-Aided mmWave Beam Selection with Non-local CNNs and Curriculum Training
IEEE Transactions on Vehicular Technology (IEEE Trans. Veh. Technol.), 2021
Matteo Zecchin
Mahdi Boloursaz Mashhadi
Mikolaj Jankowski
Deniz Gunduz
Marios Kountouris
David Gesbert
162
69
0
29 Apr 2021
Self-distillation with Batch Knowledge Ensembling Improves ImageNet Classification
Yixiao Ge
Xiao Zhang
Ching Lam Choi
Ka Chun Cheung
Peipei Zhao
Feng Zhu
Xiaogang Wang
Rui Zhao
Jiaming Song
FedML · UQCV
298
36
0
27 Apr 2021
Distilling Audio-Visual Knowledge by Compositional Contrastive Learning
Computer Vision and Pattern Recognition (CVPR), 2021
Yanbei Chen
Yongqin Xian
A. Sophia Koepke
Ying Shan
Zeynep Akata
228
95
0
22 Apr 2021
Voice2Mesh: Cross-Modal 3D Face Model Generation from Voices
Cho-Ying Wu
Ke Xu
Chin-Cheng Hsu
Ulrich Neumann
CVBM · 3DH
122
5
0
21 Apr 2021
DisCo: Remedy Self-supervised Learning on Lightweight Models with Distilled Contrastive Learning
Yuting Gao
Jia-Xin Zhuang
Xiaowei Guo
Hao Cheng
Xing Sun
Ke Li
Feiyue Huang
348
44
0
19 Apr 2021
Distilling Knowledge via Knowledge Review
Computer Vision and Pattern Recognition (CVPR), 2021
Pengguang Chen
Shu Liu
Hengshuang Zhao
Jiaya Jia
354
568
0
19 Apr 2021
Vision Transformer Pruning
Mingjian Zhu
Yehui Tang
Kai Han
ViT
400
108
0
17 Apr 2021
Learning from 2D: Contrastive Pixel-to-Point Knowledge Transfer for 3D Pretraining
Yueh-Cheng Liu
Yu-Kai Huang
HungYueh Chiang
Hung-Ting Su
Zhe-Yu Liu
Chin-Tang Chen
Ching-Yu Tseng
Winston H. Hsu
3DPC
123
35
0
10 Apr 2021
MRI-based Alzheimer's disease prediction via distilling the knowledge in multi-modal data
NeuroImage, 2021
Hao Guan
Chaoyue Wang
Dacheng Tao
145
39
0
08 Apr 2021
Distilling and Transferring Knowledge via cGAN-generated Samples for Image Classification and Regression
Expert Systems with Applications (ESWA), 2021
Xin Ding
Z. J. Wang
Zuheng Xu
Z. Jane Wang
William J. Welch
277
23
0
07 Apr 2021
Contrastive Syn-to-Real Generalization
International Conference on Learning Representations (ICLR), 2021
Wuyang Chen
Zhiding Yu
Shalini De Mello
Sifei Liu
J. Álvarez
Zinan Lin
Anima Anandkumar
166
49
0
06 Apr 2021
Graph Contrastive Clustering
IEEE International Conference on Computer Vision (ICCV), 2021
Huasong Zhong
Yue Yu
Chong Chen
Jianqiang Huang
Minghua Deng
Liqiang Nie
Zhouchen Lin
Xiansheng Hua
138
147
0
03 Apr 2021
A Realistic Evaluation of Semi-Supervised Learning for Fine-Grained Classification
Computer Vision and Pattern Recognition (CVPR), 2021
Jong-Chyi Su
Zezhou Cheng
Subhransu Maji
194
62
0
01 Apr 2021
Unsupervised Domain Expansion for Visual Categorization
Jie Wang
Kaibin Tian
Dayong Ding
Gang Yang
Xirong Li
141
9
0
01 Apr 2021
Knowledge Distillation By Sparse Representation Matching
D. Tran
Moncef Gabbouj
Alexandros Iosifidis
141
0
0
31 Mar 2021
Complementary Relation Contrastive Distillation
Computer Vision and Pattern Recognition (CVPR), 2021
Jinguo Zhu
Weizhen He
Dapeng Chen
Shijie Yu
Yakun Liu
A. Yang
M. Rong
Xiaohua Wang
153
90
0
29 Mar 2021
Embedding Transfer with Label Relaxation for Improved Metric Learning
Computer Vision and Pattern Recognition (CVPR), 2021
Sungyeon Kim
Dongwon Kim
Minsu Cho
Suha Kwak
155
41
0
27 Mar 2021
Deep Ensemble Collaborative Learning by using Knowledge-transfer Graph for Fine-grained Object Classification
Naoki Okamoto
Soma Minami
Tsubasa Hirakawa
Takayoshi Yamashita
H. Fujiyoshi
FedML
95
2
0
27 Mar 2021
Distilling a Powerful Student Model via Online Knowledge Distillation
IEEE Transactions on Neural Networks and Learning Systems (TNNLS), 2021
Shaojie Li
Mingbao Lin
Yan Wang
Yongjian Wu
Yonghong Tian
Ling Shao
Rongrong Ji
FedML
222
60
0
26 Mar 2021
Universal Representation Learning from Multiple Domains for Few-shot Classification
IEEE International Conference on Computer Vision (ICCV), 2021
Weihong Li
Xialei Liu
Hakan Bilen
SSL · OOD · VLM
185
108
0
25 Mar 2021
Student Network Learning via Evolutionary Knowledge Distillation
Kangkai Zhang
Chunhui Zhang
Shikun Li
Dan Zeng
Shiming Ge
147
99
0
23 Mar 2021
Compacting Deep Neural Networks for Internet of Things: Methods and Applications
IEEE Internet of Things Journal (IEEE IoT Journal), 2021
Ke Zhang
Hanbo Ying
Hongning Dai
Lin Li
Yuangyuang Peng
Keyi Guo
Hongfang Yu
218
44
0
20 Mar 2021
Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation
Computer Vision and Pattern Recognition (CVPR), 2021
Mingi Ji
Seungjae Shin
Seunghyun Hwang
Gibeom Park
Il-Chul Moon
139
155
0
15 Mar 2021
BreakingBED -- Breaking Binary and Efficient Deep Neural Networks by Adversarial Attacks
M. Vemparala
Alexander Frickenstein
Nael Fasfous
Lukas Frickenstein
Qi Zhao
...
Daniel Ehrhardt
Yuankai Wu
C. Unger
N. S. Nagaraja
W. Stechele
AAML
99
0
0
14 Mar 2021
A New Training Framework for Deep Neural Network
Zhenyan Hou
Wenxuan Fan
FedML
255
2
0
12 Mar 2021
SMIL: Multimodal Learning with Severely Missing Modality
AAAI Conference on Artificial Intelligence (AAAI), 2021
Mengmeng Ma
Jian Ren
Long Zhao
Sergey Tulyakov
Cathy H. Wu
Xi Peng
209
334
0
09 Mar 2021
There is More than Meets the Eye: Self-Supervised Multi-Object Detection and Tracking with Sound by Distilling Multimodal Knowledge
Computer Vision and Pattern Recognition (CVPR), 2021
Francisco Rivera Valverde
Juana Valeria Hurtado
Abhinav Valada
179
81
0
01 Mar 2021
Distilling Knowledge via Intermediate Classifiers
Aryan Asadian
Amirali Salehi-Abari
127
1
0
28 Feb 2021
PURSUhInT: In Search of Informative Hint Points Based on Layer Clustering for Knowledge Distillation
Expert Systems with Applications (ESWA), 2021
Reyhan Kevser Keser
Aydin Ayanzadeh
O. A. Aghdam
Çaglar Kilcioglu
B. U. Toreyin
N. K. Üre
278
7
0
26 Feb 2021
Even your Teacher Needs Guidance: Ground-Truth Targets Dampen Regularization Imposed by Self-Distillation
Neural Information Processing Systems (NeurIPS), 2021
Kenneth Borup
L. Andersen
160
16
0
25 Feb 2021
Semantically-Conditioned Negative Samples for Efficient Contrastive Learning
J. Ó. Neill
Danushka Bollegala
135
6
0
12 Feb 2021
Learning Student-Friendly Teacher Networks for Knowledge Distillation
Neural Information Processing Systems (NeurIPS), 2021
D. Park
Moonsu Cha
C. Jeong
Daesin Kim
Bohyung Han
401
115
0
12 Feb 2021
Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching
AAAI Conference on Artificial Intelligence (AAAI), 2021
Mingi Ji
Byeongho Heo
Sungrae Park
216
174
0
05 Feb 2021
Rethinking Soft Labels for Knowledge Distillation: A Bias-Variance Tradeoff Perspective
International Conference on Learning Representations (ICLR), 2021
Helong Zhou
Liangchen Song
Jiajie Chen
Ye Zhou
Guoli Wang
Junsong Yuan
Qian Zhang
315
199
0
01 Feb 2021
Re-labeling ImageNet: from Single to Multi-Labels, from Global to Localized Labels
Computer Vision and Pattern Recognition (CVPR), 2021
Sangdoo Yun
Seong Joon Oh
Byeongho Heo
Dongyoon Han
Junsuk Choe
Sanghyuk Chun
965
163
0
13 Jan 2021
SEED: Self-supervised Distillation For Visual Representation
International Conference on Learning Representations (ICLR), 2021
Zhiyuan Fang
Jianfeng Wang
Lijuan Wang
Lei Zhang
Yezhou Yang
Zicheng Liu
SSL
461
208
0
12 Jan 2021
Knowledge Distillation in Iterative Generative Models for Improved Sampling Speed
Eric Luhman
Troy Luhman
DiffM
424
339
0
07 Jan 2021
MSD: Saliency-aware Knowledge Distillation for Multimodal Understanding
Conference on Empirical Methods in Natural Language Processing (EMNLP), 2021
Woojeong Jin
Maziar Sanjabi
Shaoliang Nie
L Tan
Xiang Ren
Hamed Firooz
127
6
0
06 Jan 2021
AttentionLite: Towards Efficient Self-Attention Models for Vision
IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), 2020
Souvik Kundu
Sairam Sundaresan
134
22
0
21 Dec 2020
Diverse Knowledge Distillation for End-to-End Person Search
AAAI Conference on Artificial Intelligence (AAAI), 2020
Xinyu Zhang
Xinlong Wang
Jiawang Bian
Chunhua Shen
Mingyu You
FedML
175
38
0
21 Dec 2020
Computation-Efficient Knowledge Distillation via Uncertainty-Aware Mixup
Pattern Recognition (Pattern Recognit.), 2020
Guodong Xu
Ziwei Liu
Chen Change Loy
UQCV
186
42
0
17 Dec 2020
ISD: Self-Supervised Learning by Iterative Similarity Distillation
IEEE International Conference on Computer Vision (ICCV), 2020
Ajinkya Tejankar
Soroush Abbasi Koohpayegani
Vipin Pillai
Paolo Favaro
Hamed Pirsiavash
SSL
273
46
0
16 Dec 2020
Wasserstein Contrastive Representation Distillation
Computer Vision and Pattern Recognition (CVPR), 2020
Liqun Chen
Dong Wang
Zhe Gan
Jingjing Liu
Ricardo Henao
Lawrence Carin
157
107
0
15 Dec 2020
LRC-BERT: Latent-representation Contrastive Knowledge Distillation for Natural Language Understanding
AAAI Conference on Artificial Intelligence (AAAI), 2020
Hao Fu
Shaojun Zhou
Qihong Yang
Junjie Tang
Guiquan Liu
Kaikui Liu
Xiaolong Li
277
65
0
14 Dec 2020
Model Compression Using Optimal Transport
Suhas Lohit
Michael J. Jones
181
9
0
07 Dec 2020
Cross-Layer Distillation with Semantic Calibration
AAAI Conference on Artificial Intelligence (AAAI), 2020
Defang Chen
Jian-Ping Mei
Yuan Zhang
Can Wang
Yan Feng
Chun-Yen Chen
FedML
250
347
0
06 Dec 2020
Multi-head Knowledge Distillation for Model Compression
Haiquan Wang
Suhas Lohit
Michael J. Jones
Y. Fu
95
5
0
05 Dec 2020