Contrastive Representation Distillation
International Conference on Learning Representations (ICLR), 2019
23 October 2019
Yonglong Tian
Dilip Krishnan
Phillip Isola
ArXiv (abs) · PDF · HTML · GitHub (2336★)

Papers citing "Contrastive Representation Distillation"

50 / 686 papers shown
Measure Twice, Cut Once: Quantifying Bias and Fairness in Deep Neural Networks
Cody Blakeney
G. Atkinson
Nathaniel Huish
Yan Yan
V. Metsis
Ziliang Zong
86 · 3 · 0 · 08 Oct 2021
3D Infomax improves GNNs for Molecular Property Prediction
International Conference on Machine Learning (ICML), 2021
Hannes Stärk
Dominique Beaini
Gabriele Corso
Prudencio Tossou
Christian Dallago
Stephan Günnemann
Pietro Lio
AI4CE
246 · 247 · 0 · 08 Oct 2021
KNOT: Knowledge Distillation using Optimal Transport for Solving NLP Tasks
Rishabh Bhardwaj
Tushar Vaidya
Soujanya Poria
OT · FedML
304 · 10 · 0 · 06 Oct 2021
Deep Neural Compression Via Concurrent Pruning and Self-Distillation
J. Ó. Neill
Sourav Dutta
H. Assem
VLM
107 · 5 · 0 · 30 Sep 2021
Prune Your Model Before Distill It
Jinhyuk Park
Albert No
VLM
320 · 37 · 0 · 30 Sep 2021
Towards Communication-Efficient and Privacy-Preserving Federated Representation Learning
Haizhou Shi
Youcai Zhang
Zijin Shen
Siliang Tang
Yaqian Li
Yandong Guo
Yueting Zhuang
86 · 7 · 0 · 29 Sep 2021
Partial to Whole Knowledge Distillation: Progressive Distilling Decomposed Knowledge Boosts Student Better
Xuanyang Zhang
Xinming Zhang
Jian Sun
135 · 1 · 0 · 26 Sep 2021
RAIL-KD: RAndom Intermediate Layer Mapping for Knowledge Distillation
Md. Akmal Haidar
Nithin Anchuri
Mehdi Rezagholizadeh
Abbas Ghaddar
Philippe Langlais
Pascal Poupart
263 · 26 · 0 · 21 Sep 2021
ConvFiT: Conversational Fine-Tuning of Pretrained Language Models
Ivan Vulić
Pei-hao Su
Sam Coope
D. Gerz
Paweł Budzianowski
I. Casanueva
Nikola Mrkšić
Tsung-Hsien Wen
230 · 39 · 0 · 21 Sep 2021
Label Assignment Distillation for Object Detection
Hailun Zhang
76 · 2 · 0 · 16 Sep 2021
Multihop: Leveraging Complex Models to Learn Accurate Simple Models
Amit Dhurandhar
Tejaswini Pedapati
155 · 0 · 0 · 14 Sep 2021
Multi-Scale Aligned Distillation for Low-Resolution Detection
Lu Qi
Jason Kuen
Jiuxiang Gu
Zhe Lin
Yi Wang
Yukang Chen
Yanwei Li
Jiaya Jia
160 · 65 · 0 · 14 Sep 2021
Knowledge Distillation Using Hierarchical Self-Supervision Augmented Distribution
IEEE Transactions on Neural Networks and Learning Systems (TNNLS), 2021
Chuanguang Yang
Zhulin An
Linhang Cai
Yongjun Xu
220 · 23 · 0 · 07 Sep 2021
Structure-Aware Hard Negative Mining for Heterogeneous Graph Contrastive Learning
Yanqiao Zhu
Yichen Xu
Hejie Cui
Carl Yang
Qiang Liu
Shu Wu
127 · 9 · 0 · 31 Aug 2021
Full-Cycle Energy Consumption Benchmark for Low-Carbon Computer Vision
Yue Liu
Xinyang Jiang
Donglin Bai
Yuge Zhang
Ningxin Zheng
Xuanyi Dong
Lu Liu
Yuqing Yang
Dongsheng Li
117 · 11 · 0 · 30 Aug 2021
Lipschitz Continuity Guided Knowledge Distillation
IEEE International Conference on Computer Vision (ICCV), 2021
Yuzhang Shang
Bin Duan
Ziliang Zong
Liqiang Nie
Yan Yan
161 · 30 · 0 · 29 Aug 2021
Compact representations of convolutional neural networks via weight pruning and quantization
Giosuè Cataldo Marinò
A. Petrini
D. Malchiodi
Marco Frasca
MQ
89 · 4 · 0 · 28 Aug 2021
YANMTT: Yet Another Neural Machine Translation Toolkit
Annual Meeting of the Association for Computational Linguistics (ACL), 2021
Mary Dabre
Eiichiro Sumita
188 · 14 · 0 · 25 Aug 2021
Efficient Medical Image Segmentation Based on Knowledge Distillation
IEEE Transactions on Medical Imaging (IEEE TMI), 2021
Dian Qin
Jiajun Bu
Zhe Liu
Xin Shen
Sheng Zhou
Jingjun Gu
Zhihong Wang
Lei Wu
Hui-Fen Dai
124 · 148 · 0 · 23 Aug 2021
G-DetKD: Towards General Distillation Framework for Object Detectors via Contrastive and Semantic-guided Feature Imitation
Lewei Yao
Renjie Pi
Hang Xu
Wei Zhang
Zhenguo Li
Tong Zhang
194 · 43 · 0 · 17 Aug 2021
Multi-granularity for knowledge distillation
Baitan Shao
Ying Chen
99 · 4 · 0 · 15 Aug 2021
Distilling Holistic Knowledge with Graph Neural Networks
IEEE International Conference on Computer Vision (ICCV), 2021
Sheng Zhou
Yucheng Wang
Defang Chen
Jiawei Chen
Xin Eric Wang
Can Wang
Jiajun Bu
152 · 66 · 0 · 12 Aug 2021
Learning an Augmented RGB Representation with Cross-Modal Knowledge Distillation for Action Detection
IEEE International Conference on Computer Vision (ICCV), 2021
Rui Dai
Srijan Das
Francois Bremond
147 · 47 · 0 · 08 Aug 2021
Unsupervised Cross-Modal Distillation for Thermal Infrared Tracking
ACM Multimedia (ACM MM), 2021
Jingxian Sun
Lichao Zhang
Yufei Zha
Abel Gonzalez-Garcia
Peng Zhang
Wei Huang
Yanning Zhang
ViT
125 · 33 · 0 · 31 Jul 2021
Hierarchical Self-supervised Augmented Knowledge Distillation
International Joint Conference on Artificial Intelligence (IJCAI), 2021
Chuanguang Yang
Zhulin An
Linhang Cai
Yongjun Xu
SSL
223 · 90 · 0 · 29 Jul 2021
Revisiting Catastrophic Forgetting in Class Incremental Learning
Zixuan Ni
Haizhou Shi
Siliang Tang
Longhui Wei
Qi Tian
Yueting Zhuang
CLL
243 · 9 · 0 · 26 Jul 2021
Characterizing Generalization under Out-Of-Distribution Shifts in Deep Metric Learning
Neural Information Processing Systems (NeurIPS), 2021
Timo Milbich
Karsten Roth
Samarth Sinha
Ludwig Schmidt
Marzyeh Ghassemi
Bjorn Ommer
OOD · OODD
258 · 24 · 0 · 20 Jul 2021
Follow Your Path: a Progressive Method for Knowledge Distillation
Wenxian Shi
Yuxuan Song
Hao Zhou
Bohan Li
Lei Li
105 · 17 · 0 · 20 Jul 2021
Representation Consolidation for Training Expert Students
Zhizhong Li
Avinash Ravichandran
Charless C. Fowlkes
M. Polito
Rahul Bhotika
Stefano Soatto
107 · 6 · 0 · 16 Jul 2021
Novel Visual Category Discovery with Dual Ranking Statistics and Mutual Knowledge Distillation
Neural Information Processing Systems (NeurIPS), 2021
Bingchen Zhao
Kai Han
198 · 135 · 0 · 07 Jul 2021
Categorical Relation-Preserving Contrastive Knowledge Distillation for Medical Image Classification
Xiaohan Xing
Yuenan Hou
Han Li
Yixuan Yuan
Jiaming Song
Max Meng
VLM
120 · 46 · 0 · 07 Jul 2021
VidLanKD: Improving Language Understanding via Video-Distilled Knowledge Transfer
Zineng Tang
Jaemin Cho
Hao Tan
Joey Tianyi Zhou
VLM
146 · 30 · 0 · 06 Jul 2021
On The Distribution of Penultimate Activations of Classification Networks
Minkyo Seo
Yoonho Lee
Suha Kwak
UQCV
167 · 5 · 0 · 05 Jul 2021
Bag of Instances Aggregation Boosts Self-supervised Distillation
Haohang Xu
Jiemin Fang
Xiaopeng Zhang
Lingxi Xie
Xinggang Wang
Wenrui Dai
H. Xiong
Qi Tian
SSL
264 · 25 · 0 · 04 Jul 2021
Isotonic Data Augmentation for Knowledge Distillation
Wanyun Cui
Sen Yan
192 · 7 · 0 · 03 Jul 2021
Revisiting Knowledge Distillation: An Inheritance and Exploration Framework
Computer Vision and Pattern Recognition (CVPR), 2021
Zhen Huang
Xu Shen
Jun Xing
Tongliang Liu
Xinmei Tian
Houqiang Li
Bing Deng
Jianqiang Huang
Xiansheng Hua
118 · 35 · 0 · 01 Jul 2021
Self-Contrastive Learning: Single-viewed Supervised Contrastive Framework using Sub-network
AAAI Conference on Artificial Intelligence (AAAI), 2021
Sangmin Bae
Sungnyun Kim
Jongwoo Ko
Gihun Lee
SeungJong Noh
Se-Young Yun
SSL
364 · 8 · 0 · 29 Jun 2021
Co-advise: Cross Inductive Bias Distillation
Sucheng Ren
Zhengqi Gao
Tianyu Hua
Zihui Xue
Yonglong Tian
Shengfeng He
Hang Zhao
184 · 65 · 0 · 23 Jun 2021
Simon Says: Evaluating and Mitigating Bias in Pruned Neural Networks with Knowledge Distillation
Cody Blakeney
Nathaniel Huish
Yan Yan
Ziliang Zong
103 · 19 · 0 · 15 Jun 2021
Distilling Image Classifiers in Object Detectors
Neural Information Processing Systems (NeurIPS), 2021
Shuxuan Guo
J. Álvarez
Mathieu Salzmann
VLM
166 · 8 · 0 · 09 Jun 2021
BERT Learns to Teach: Knowledge Distillation with Meta Learning
Annual Meeting of the Association for Computational Linguistics (ACL), 2021
Wangchunshu Zhou
Canwen Xu
Julian McAuley
227 · 102 · 0 · 08 Jun 2021
Improving Event Causality Identification via Self-Supervised Representation Learning on External Causal Statement
Findings (Findings), 2021
Xinyu Zuo
Pengfei Cao
Yubo Chen
Kang Liu
Jun Zhao
Weihua Peng
Yuguang Chen
139 · 61 · 0 · 03 Jun 2021
Fair Feature Distillation for Visual Recognition
Computer Vision and Pattern Recognition (CVPR), 2021
S. Jung
Donggyu Lee
Taeeon Park
Taesup Moon
186 · 87 · 0 · 27 May 2021
Multiple Domain Experts Collaborative Learning: Multi-Source Domain Generalization For Person Re-Identification
Shijie Yu
Feng Zhu
Dapeng Chen
Rui Zhao
Haobin Chen
Weizhen He
Jinguo Zhu
Yu Qiao
OOD
174 · 18 · 0 · 26 May 2021
Towards Compact Single Image Super-Resolution via Contrastive Self-distillation
International Joint Conference on Artificial Intelligence (IJCAI), 2021
Yanbo Wang
Shaohui Lin
Yanyun Qu
Haiyan Wu
Zhizhong Zhang
Yuan Xie
Angela Yao
SupR
165 · 55 · 0 · 25 May 2021
Comparing Kullback-Leibler Divergence and Mean Squared Error Loss in Knowledge Distillation
International Joint Conference on Artificial Intelligence (IJCAI), 2021
Taehyeon Kim
Jaehoon Oh
Nakyil Kim
Sangwook Cho
Se-Young Yun
141 · 282 · 0 · 19 May 2021
VPN++: Rethinking Video-Pose embeddings for understanding Activities of Daily Living
IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI), 2021
Srijan Das
Rui Dai
Di Yang
Francois Bremond
ViT
234 · 82 · 0 · 17 May 2021
Divide and Contrast: Self-supervised Learning from Uncurated Data
IEEE International Conference on Computer Vision (ICCV), 2021
Yonglong Tian
Olivier J. Hénaff
Aaron van den Oord
SSL
264 · 110 · 0 · 17 May 2021
Graph-Free Knowledge Distillation for Graph Neural Networks
International Joint Conference on Artificial Intelligence (IJCAI), 2021
Xiang Deng
Zhongfei Zhang
156 · 79 · 0 · 16 May 2021
KDExplainer: A Task-oriented Attention Model for Explaining Knowledge Distillation
International Joint Conference on Artificial Intelligence (IJCAI), 2021
Mengqi Xue
Mingli Song
Xinchao Wang
Pengcheng Chen
Xingen Wang
Xiuming Zhang
165 · 14 · 0 · 10 May 2021