Contrastive Representation Distillation
Yonglong Tian, Dilip Krishnan, Phillip Isola
arXiv:1910.10699 · 23 October 2019
Papers citing "Contrastive Representation Distillation"
50 / 611 papers shown
Learning from Rich Semantics and Coarse Locations for Long-tailed Object Detection
Lingchen Meng, Xiyang Dai, Jianwei Yang, Dongdong Chen, Yinpeng Chen, Mengchen Liu, Yi-Ling Chen, Zuxuan Wu, Lu Yuan, Yu-Gang Jiang · 18 Oct 2023

Getting aligned on representational alignment
Ilia Sucholutsky, Lukas Muttenthaler, Adrian Weller, Andi Peng, Andreea Bobu, ..., Thomas Unterthiner, Andrew Kyle Lampinen, Klaus-Robert Muller, M. Toneva, Thomas L. Griffiths · 18 Oct 2023

Exploiting User Comments for Early Detection of Fake News Prior to Users' Commenting
Qiong Nan, Qiang Sheng, Juan Cao, Yongchun Zhu, Danding Wang, Guang Yang, Jintao Li, Kai Shu · 16 Oct 2023

Learning Unified Representations for Multi-Resolution Face Recognition
Hulingxiao He, Wu Yuan, Yidian Huang, Shilong Zhao, Wen Yuan, Hanqin Li · CVBM · 14 Oct 2023

Towards the Fundamental Limits of Knowledge Transfer over Finite Domains
Qingyue Zhao, Banghua Zhu · 11 Oct 2023

Fast and Robust Early-Exiting Framework for Autoregressive Language Models with Synchronized Parallel Decoding
Sangmin Bae, Jongwoo Ko, Hwanjun Song, SeYoung Yun · 09 Oct 2023

OpenIncrement: A Unified Framework for Open Set Recognition and Deep Class-Incremental Learning
Jiawen Xu, Claas Grohnfeldt, O. Kao · BDL, CLL · 05 Oct 2023

LumiNet: The Bright Side of Perceptual Knowledge Distillation
Md. Ismail Hossain, M. M. L. Elahi, Sameera Ramasinghe, A. Cheraghian, Fuad Rahman, Nabeel Mohammed, Shafin Rahman · 05 Oct 2023

Continual Contrastive Spoken Language Understanding
Umberto Cappellazzo, Enrico Fini, Muqiao Yang, Daniele Falavigna, A. Brutti, Bhiksha Raj · CLL · 04 Oct 2023

Improving Knowledge Distillation with Teacher's Explanation
S. Chowdhury, Ben Liang, A. Tizghadam, Ilijc Albanese · FAtt · 04 Oct 2023

Heterogeneous Federated Learning Using Knowledge Codistillation
Jared Lichtarge, Ehsan Amid, Shankar Kumar, Tien-Ju Yang, Rohan Anil, Rajiv Mathews · FedML · 04 Oct 2023

Generalizable Heterogeneous Federated Cross-Correlation and Instance Similarity Learning
Wenke Huang, J. J. Valero-Mas, Dasaem Jeong, Bo Du · FedML · 28 Sep 2023

VideoAdviser: Video Knowledge Distillation for Multimodal Transfer Learning
Yanan Wang, Donghuo Zeng, Shinya Wada, Satoshi Kurihara · 27 Sep 2023

Weight Averaging Improves Knowledge Distillation under Domain Shift
Valeriy Berezovskiy, Nikita Morozov · MoMe · 20 Sep 2023

Heterogeneous Generative Knowledge Distillation with Masked Image Modeling
Ziming Wang, Shumin Han, Xiaodi Wang, Jing Hao, Xianbin Cao, Baochang Zhang · VLM · 18 Sep 2023

Quality-Agnostic Deepfake Detection with Intra-model Collaborative Learning
B. Le, Simon S. Woo · AAML · 12 Sep 2023

3D Denoisers are Good 2D Teachers: Molecular Pretraining via Denoising and Cross-Modal Distillation
Sungjun Cho, Dae-Woong Jeong, Sung Moon Ko, Jinwoo Kim, Sehui Han, Seunghoon Hong, Honglak Lee, Moontae Lee · AI4CE, DiffM · 08 Sep 2023

Rethinking Momentum Knowledge Distillation in Online Continual Learning
Nicolas Michel, Maorong Wang, L. Xiao, T. Yamasaki · CLL · 06 Sep 2023

Knowledge Distillation Layer that Lets the Student Decide
Ada Gorgun, Y. Z. Gürbüz, Aydin Alatan · 06 Sep 2023

Code Representation Pre-training with Complements from Program Executions
Jiabo Huang, Jianyu Zhao, Yuyang Rong, Yiwen Guo, Yifeng He, Hao Chen · 04 Sep 2023

MoMA: Momentum Contrastive Learning with Multi-head Attention-based Knowledge Distillation for Histopathology Image Analysis
T. Vuong, J. T. Kwak · 31 Aug 2023

SynthDistill: Face Recognition with Knowledge Distillation from Synthetic Data
Hatef Otroshi, Anjith George, Sébastien Marcel · 28 Aug 2023

Efficient View Synthesis with Neural Radiance Distribution Field
Yushuang Wu, Xiao Li, Jinglu Wang, Xiaoguang Han, Shuguang Cui, Yan Lu · 22 Aug 2023

Diffusion Model as Representation Learner
Xingyi Yang, Xinchao Wang · DiffM · 21 Aug 2023

GiGaMAE: Generalizable Graph Masked Autoencoder via Collaborative Latent Space Reconstruction
Yucheng Shi, Yushun Dong, Qiaoyu Tan, Jundong Li, Ninghao Liu · 18 Aug 2023

Unlimited Knowledge Distillation for Action Recognition in the Dark
Ruibing Jin, Guosheng Lin, Min-man Wu, Jie Lin, Zhengguo Li, Xiaoli Li, Zhenghua Chen · 18 Aug 2023

CCFace: Classification Consistency for Low-Resolution Face Recognition
Mohammad Saeed Ebrahimi Saadabadi, Sahar Rahimi Malakshan, Hossein Kashiani, Nasser M. Nasrabadi · CVBM · 18 Aug 2023

ZhiJian: A Unifying and Rapidly Deployable Toolbox for Pre-trained Model Reuse
Yi-Kai Zhang, Lu Ren, Chao Yi, Qiwen Wang, De-Chuan Zhan, Han-Jia Ye · 17 Aug 2023

Learning to Distill Global Representation for Sparse-View CT
Zilong Li, Chenglong Ma, Jie Chen, Junping Zhang, Hongming Shan · 16 Aug 2023

Story Visualization by Online Text Augmentation with Context Memory
Daechul Ahn, Daneul Kim, Gwangmo Song, Seung Wook Kim, Honglak Lee, Dongyeop Kang, Jonghyun Choi · DiffM · 15 Aug 2023

Estimator Meets Equilibrium Perspective: A Rectified Straight Through Estimator for Binary Neural Networks Training
Xiao-Ming Wu, Dian Zheng, Zuhao Liu, Weishi Zheng · MQ · 13 Aug 2023

Multi-Label Knowledge Distillation
Penghui Yang, Ming-Kun Xie, Chen-Chen Zong, Lei Feng, Gang Niu, Masashi Sugiyama, Sheng-Jun Huang · 12 Aug 2023

Teacher-Student Architecture for Knowledge Distillation: A Survey
Chengming Hu, Xuan Li, Danyang Liu, Haolun Wu, Xi Chen, Ju Wang, Xue Liu · 08 Aug 2023

NormKD: Normalized Logits for Knowledge Distillation
Zhihao Chi, Tu Zheng, Hengjia Li, Zheng Yang, Boxi Wu, Binbin Lin, D. Cai · 01 Aug 2023

Sampling to Distill: Knowledge Transfer from Open-World Data
Yuzheng Wang, Zhaoyu Chen, Jie M. Zhang, Dingkang Yang, Zuhao Ge, Yang Liu, Siao Liu, Yunquan Sun, Wenqiang Zhang, Lizhe Qi · 31 Jul 2023

Class-relation Knowledge Distillation for Novel Class Discovery
Peiyan Gu, Chuyu Zhang, Rui Xu, Xuming He · 18 Jul 2023

Cumulative Spatial Knowledge Distillation for Vision Transformers
Borui Zhao, Renjie Song, Jiajun Liang · 17 Jul 2023

DOT: A Distillation-Oriented Trainer
Borui Zhao, Quan Cui, Renjie Song, Jiajun Liang · 17 Jul 2023

The Staged Knowledge Distillation in Video Classification: Harmonizing Student Progress by a Complementary Weakly Supervised Framework
Chao Wang, Zhenghang Tang · 11 Jul 2023

Customizing Synthetic Data for Data-Free Student Learning
Shiya Luo, Defang Chen, Can Wang · 10 Jul 2023

Distilling Large Vision-Language Model with Out-of-Distribution Generalizability
Xuanlin Li, Yunhao Fang, Minghua Liu, Z. Ling, Z. Tu, Haoran Su · VLM · 06 Jul 2023

Review helps learn better: Temporal Supervised Knowledge Distillation
Dongwei Wang, Zhi-Long Han, Yanmei Wang, Xi'ai Chen, Baicheng Liu, Yandong Tang · 03 Jul 2023

A Dimensional Structure based Knowledge Distillation Method for Cross-Modal Learning
Lingyu Si, Hongwei Dong, Wenwen Qiang, J. Yu, Wen-jie Zhai, Changwen Zheng, Fanjiang Xu, Fuchun Sun · 28 Jun 2023

Hybrid Distillation: Connecting Masked Autoencoders with Contrastive Learners
Bowen Shi, Xiaopeng Zhang, Yaoming Wang, Jin Li, Wenrui Dai, Junni Zou, H. Xiong, Qi Tian · 28 Jun 2023

Enhancing Mapless Trajectory Prediction through Knowledge Distillation
Yuning Wang, Pu Zhang, Lei Bai, Jianru Xue · 25 Jun 2023

NetBooster: Empowering Tiny Deep Learning By Standing on the Shoulders of Deep Giants
Zhongzhi Yu, Y. Fu, Jiayi Yuan, Haoran You, Yingyan Lin · 23 Jun 2023

Categories of Response-Based, Feature-Based, and Relation-Based Knowledge Distillation
Chuanguang Yang, Xinqiang Yu, Zhulin An, Yongjun Xu · VLM, OffRL · 19 Jun 2023

Enhanced Multimodal Representation Learning with Cross-modal KD
Mengxi Chen, Linyu Xing, Yu Wang, Ya-Qin Zhang · 13 Jun 2023

Adaptive Multi-Teacher Knowledge Distillation with Meta-Learning
Hailin Zhang, Defang Chen, Can Wang · 11 Jun 2023

CALICO: Self-Supervised Camera-LiDAR Contrastive Pre-training for BEV Perception
Jiachen Sun, Haizhong Zheng, Qingzhao Zhang, Atul Prakash, Z. Morley Mao, Chaowei Xiao · SSL · 01 Jun 2023