CKD: Contrastive Knowledge Distillation from A Sample-wise Perspective
arXiv: 2404.14109 · 22 April 2024
Wencheng Zhu, Xin Zhou, Pengfei Zhu, Yu Wang, Qinghua Hu
Papers citing "CKD: Contrastive Knowledge Distillation from A Sample-wise Perspective" (5 of 5 shown):
- Gap Preserving Distillation by Building Bidirectional Mappings with A Dynamic Teacher. Yong Guo, Shulian Zhang, Haolin Pan, Jing Liu, Yulun Zhang, Jian Chen. 05 Oct 2024.
- One-for-All: Bridge the Gap Between Heterogeneous Architectures in Knowledge Distillation. Zhiwei Hao, Jianyuan Guo, Kai Han, Yehui Tang, Han Hu, Yunhe Wang, Chang Xu. 30 Oct 2023.
- PointDistiller: Structured Knowledge Distillation Towards Efficient and Compact 3D Detection. Linfeng Zhang, Runpei Dong, Hung-Shuo Tai, Kaisheng Ma. 23 May 2022.
- Decoupled Contrastive Learning. Chun-Hsiao Yeh, Cheng-Yao Hong, Yen-Chi Hsu, Tyng-Luh Liu, Yubei Chen, Yann LeCun. 13 Oct 2021.
- Distilling Knowledge via Knowledge Review. Pengguang Chen, Shu-Lin Liu, Hengshuang Zhao, Jiaya Jia. 19 Apr 2021.