arXiv: 1910.10699
Contrastive Representation Distillation
International Conference on Learning Representations (ICLR), 2019
23 October 2019
Yonglong Tian, Dilip Krishnan, Phillip Isola
Papers citing "Contrastive Representation Distillation" (50 / 686 papers shown):
Mutual Information Maximization on Disentangled Representations for Differential Morph Detection. IEEE Workshop/Winter Conference on Applications of Computer Vision (WACV), 2020. Sobhan Soleymani, Ali Dabouei, Fariborz Taherkhani, J. Dawson, Nasser M. Nasrabadi. 02 Dec 2020.

Multi-level Knowledge Distillation via Knowledge Alignment and Correlation. Fei Ding, Yin Yang, Hongxin Hu, Venkat Krovi, Feng Luo. 01 Dec 2020.

torchdistill: A Modular, Configuration-Driven Framework for Knowledge Distillation. International Workshop on Reproducible Research in Pattern Recognition (RRPR), 2020. Yoshitomo Matsubara. 25 Nov 2020.

Learnable Boundary Guided Adversarial Training. IEEE International Conference on Computer Vision (ICCV), 2020. Jiequan Cui, Shu Liu, Liwei Wang, Jiaya Jia. 23 Nov 2020. Tags: OOD, AAML.

Distill2Vec: Dynamic Graph Representation Learning with Knowledge Distillation. Stefanos Antaris, Dimitrios Rafailidis. 11 Nov 2020.

On Self-Distilling Graph Neural Network. Y. Chen, Yatao Bian, Xi Xiao, Yu Rong, Qifeng Bai, Junzhou Huang. 04 Nov 2020. Tags: FedML.

Distilling Knowledge by Mimicking Features. G. Wang, Yifan Ge, Jianxin Wu. 03 Nov 2020.

Pixel-Level Cycle Association: A New Perspective for Domain Adaptive Semantic Segmentation. Neural Information Processing Systems (NeurIPS), 2020. Guoliang Kang, Yunchao Wei, Yi Yang, Yueting Zhuang, Alexander G. Hauptmann. 31 Oct 2020.

CompRess: Self-Supervised Learning by Compressing Representations. Neural Information Processing Systems (NeurIPS), 2020. Soroush Abbasi Koohpayegani, Ajinkya Tejankar, Hamed Pirsiavash. 28 Oct 2020. Tags: SSL.

Empowering Knowledge Distillation via Open Set Recognition for Robust 3D Point Cloud Classification. Pattern Recognition Letters (Pattern Recognit. Lett.), 2020. Ayush Bhardwaj, Sakshee Pimpale, S. Kumar, Biplab Banerjee. 25 Oct 2020. Tags: 3DPC.

Distributed Representations of Entities in Open-World Knowledge Graphs. Lingbing Guo, Zhuo Chen, Jiaoyan Chen, Weiqing Wang, Zequn Sun, Zhongpo Bo, Yin Fang, Chenghao Liu, Huajun Chen, Wei Hu. 16 Oct 2020. Tags: GNN.

Reducing the Teacher-Student Gap via Spherical Knowledge Distillation. Jia Guo, Minghao Chen, Yao Hu, Chen Zhu, Xiaofei He, Deng Cai. 15 Oct 2020.

Contrastive Representation Learning: A Framework and Review. IEEE Access, 2020. Phúc H. Lê Khắc, Graham Healy, Alan F. Smeaton. 10 Oct 2020. Tags: SSL, AI4TS.

Locally Linear Region Knowledge Distillation. Xiang Deng, Zhongfei Zhang. 09 Oct 2020.

Long-tailed Recognition by Routing Diverse Distribution-Aware Experts. International Conference on Learning Representations (ICLR), 2020. Xudong Wang, Long Lian, Zhongqi Miao, Ziwei Liu, Stella X. Yu. 05 Oct 2020.

Improved Knowledge Distillation via Full Kernel Matrix Transfer. Qi Qian, Hao Li, Juhua Hu. 30 Sep 2020.

Contrastive Distillation on Intermediate Representations for Language Model Compression. S. Sun, Zhe Gan, Yu Cheng, Yuwei Fang, Shuohang Wang, Jingjing Liu. 29 Sep 2020. Tags: VLM.

Feature Adaptation of Pre-Trained Language Models across Languages and Domains with Robust Self-Training. Hai Ye, Qingyu Tan, Ruidan He, Juntao Li, Hwee Tou Ng, Lidong Bing. 24 Sep 2020. Tags: VLM.

Feature Distillation With Guided Adversarial Contrastive Learning. Tao Bai, Jinnan Chen, Jun Zhao, Bihan Wen, Xudong Jiang, Alex C. Kot. 21 Sep 2020. Tags: AAML.

Densely Guided Knowledge Distillation using Multiple Teacher Assistants. IEEE International Conference on Computer Vision (ICCV), 2020. Wonchul Son, Jaemin Na, Junyong Choi, Wonjun Hwang. 18 Sep 2020.

MEAL V2: Boosting Vanilla ResNet-50 to 80%+ Top-1 Accuracy on ImageNet without Tricks. Zhiqiang Shen, Marios Savvides. 17 Sep 2020.

S2SD: Simultaneous Similarity-based Self-Distillation for Deep Metric Learning. International Conference on Machine Learning (ICML), 2020. Karsten Roth, Timo Milbich, Bjorn Ommer, Joseph Paul Cohen, Marzyeh Ghassemi. 17 Sep 2020. Tags: FedML.

DualDE: Dually Distilling Knowledge Graph Embedding for Faster and Cheaper Reasoning. Web Search and Data Mining (WSDM), 2020. Yushan Zhu, Wen Zhang, Yin Hua, Hui Chen, Xu-Xin Cheng, Wei Zhang, Huajun Chen. 13 Sep 2020.

Intra-Utterance Similarity Preserving Knowledge Distillation for Audio Tagging. Interspeech, 2020. Chun-Chieh Chang, Chieh-Chi Kao, Ming Sun, Chao Wang. 03 Sep 2020.

SAIL: Self-Augmented Graph Contrastive Learning. AAAI Conference on Artificial Intelligence (AAAI), 2020. Lu Yu, Shichao Pei, Lizhong Ding, Jun Zhou, Longfei Li, Chuxu Zhang, Xiangliang Zhang. 02 Sep 2020. Tags: SSL.

Knowledge Transfer via Dense Cross-Layer Mutual-Distillation. Anbang Yao, Dawei Sun. 18 Aug 2020.

Prime-Aware Adaptive Distillation. Youcai Zhang, Zhonghao Lan, Yuchen Dai, Fangao Zeng, Yan Bai, Jie Chang, Yichen Wei. 04 Aug 2020.

ALF: Autoencoder-based Low-rank Filter-sharing for Efficient Convolutional Neural Networks. Design Automation Conference (DAC), 2020. Alexander Frickenstein, M. Vemparala, Nael Fasfous, Laura Hauenschild, N. S. Nagaraja, C. Unger, W. Stechele. 27 Jul 2020.

Multi-label Contrastive Predictive Coding. Neural Information Processing Systems (NeurIPS), 2020. Jiaming Song, Stefano Ermon. 20 Jul 2020. Tags: SSL, VLM.

Hybrid Discriminative-Generative Training via Contrastive Learning. Hao Liu, Pieter Abbeel. 17 Jul 2020. Tags: SSL.

Knowledge Distillation for Multi-task Learning. Weihong Li, Hakan Bilen. 14 Jul 2020. Tags: MoMe.

Representation Transfer by Optimal Transport. Xuhong Li, Yves Grandvalet, Rémi Flamary, Nicolas Courty, Dejing Dou. 13 Jul 2020. Tags: OT.

Impression Space from Deep Template Network. Gongfan Fang, Xinchao Wang, Haofei Zhang, Mingli Song, Xiuming Zhang. 10 Jul 2020.

Interactive Knowledge Distillation. Shipeng Fu, Zhen Li, Jun Xu, Ming-Ming Cheng, Gwanggil Jeon, Xiaomin Yang. 03 Jul 2020.

On the Demystification of Knowledge Distillation: A Residual Network Perspective. N. Jha, Rajat Saini, Sparsh Mittal. 30 Jun 2020.

ContraGAN: Contrastive Learning for Conditional Image Generation. Minguk Kang, Jaesik Park. 23 Jun 2020. Tags: GAN.

Multi-fidelity Neural Architecture Search with Knowledge Distillation. I. Trofimov, Nikita Klyuchnikov, Mikhail Salnikov, Alexander N. Filippov, Evgeny Burnaev. 15 Jun 2020.

DTG-Net: Differentiated Teachers Guided Self-Supervised Video Action Recognition. Ziming Liu, Guangyu Gao, A. K. Qin, Jinyang Li. 13 Jun 2020. Tags: ViT.

Ensemble Distillation for Robust Model Fusion in Federated Learning. Neural Information Processing Systems (NeurIPS), 2020. Tao Lin, Lingjing Kong, Sebastian U. Stich, Martin Jaggi. 12 Jun 2020. Tags: FedML.

Knowledge Distillation Meets Self-Supervision. European Conference on Computer Vision (ECCV), 2020. Guodong Xu, Ziwei Liu, Xiaoxiao Li, Chen Change Loy. 12 Jun 2020. Tags: FedML.

Knowledge Distillation: A Survey. Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao. 09 Jun 2020. Tags: VLM.

C-SL: Contrastive Sound Localization with Inertial-Acoustic Sensors. Majid Mirbagheri, Bardia Doosti. 09 Jun 2020.

ResKD: Residual-Guided Knowledge Distillation. Xuewei Li, Songyuan Li, Bourahla Omar, Leilei Gan, Xi Li. 08 Jun 2020.

Multi-view Contrastive Learning for Online Knowledge Distillation. Chuanguang Yang, Zhulin An, Yongjun Xu. 07 Jun 2020.

An Overview of Neural Network Compression. James O'Neill. 05 Jun 2020. Tags: AI4CE.

Channel Distillation: Channel-Wise Attention for Knowledge Distillation. Zaida Zhou, Chaoran Zhuge, Xinwei Guan, Wen Liu. 02 Jun 2020.

What Makes for Good Views for Contrastive Learning? Yonglong Tian, Chen Sun, Ben Poole, Dilip Krishnan, Cordelia Schmid, Phillip Isola. 20 May 2020. Tags: SSL.

Learning from a Lightweight Teacher for Efficient Knowledge Distillation. Yuang Liu, Wei Zhang, Jun Wang. 19 May 2020.

Distilling Spikes: Knowledge Distillation in Spiking Neural Networks. International Conference on Pattern Recognition (ICPR), 2020. R. K. Kushawaha, S. Kumar, Biplab Banerjee, R. Velmurugan. 01 May 2020.

Teacher-Class Network: A Neural Network Compression Mechanism. British Machine Vision Conference (BMVC), 2020. Shaiq Munir Malik, Muhammad Umair Haider, Fnu Mohbat, Musab Rasheed, M. Taj. 07 Apr 2020.