Complementary Relation Contrastive Distillation
arXiv 2103.16367 · 29 March 2021
Jinguo Zhu, Shixiang Tang, Dapeng Chen, Shijie Yu, Yakun Liu, A. Yang, M. Rong, Xiaohua Wang
Papers citing "Complementary Relation Contrastive Distillation" (8 of 8 shown):

| Title | Authors | Date |
|---|---|---|
| Rethinking Knowledge in Distillation: An In-context Sample Retrieval Perspective | Jinjing Zhu, Songze Li, Lin Wang | 13 Jan 2025 |
| Teacher-Student Architecture for Knowledge Distillation: A Survey | Chengming Hu, Xuan Li, Danyang Liu, Haolun Wu, Xi Chen, Ju Wang, Xue Liu | 08 Aug 2023 |
| Generalized Knowledge Distillation via Relationship Matching | Han-Jia Ye, Su Lu, De-Chuan Zhan | 04 May 2022 |
| R-DFCIL: Relation-Guided Representation Learning for Data-Free Class Incremental Learning | Qiankun Gao, Chen Zhao, Bernard Ghanem, Jian Zhang | 24 Mar 2022 |
| Exploring Patch-wise Semantic Relation for Contrastive Learning in Image-to-Image Translation Tasks | Chanyong Jung, Gihyun Kwon, Jong Chul Ye | 03 Mar 2022 |
| Anomaly Detection via Reverse Distillation from One-Class Embedding | Hanqiu Deng, Xingyu Li | 26 Jan 2022 |
| Optimizing for In-memory Deep Learning with Emerging Memory Technology | Zhehui Wang, Tao Luo, Rick Siow Mong Goh, Wei Zhang, Weng-Fai Wong | 01 Dec 2021 |
| MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications | Andrew G. Howard, Menglong Zhu, Bo Chen, Dmitry Kalenichenko, Weijun Wang, Tobias Weyand, M. Andreetto, Hartwig Adam | 17 Apr 2017 |