Function-Consistent Feature Distillation
arXiv 2304.11832
24 April 2023
Dongyang Liu, Meina Kan, Shiguang Shan, Xilin Chen

Cited By
Papers citing "Function-Consistent Feature Distillation" (6 of 6 papers shown)
VRM: Knowledge Distillation via Virtual Relation Matching
W. Zhang, Fei Xie, Weidong Cai, Chao Ma
28 Feb 2025

Self-supervised Video Object Segmentation with Distillation Learning of Deformable Attention (VOS)
Quang-Trung Truong, Duc Thanh Nguyen, Binh-Son Hua, Sai-Kit Yeung
25 Jan 2024

CrossKD: Cross-Head Knowledge Distillation for Object Detection
Jiabao Wang, Yuming Chen, Zhaohui Zheng, Xiang Li, Ming-Ming Cheng, Qibin Hou
20 Jun 2023

Distilling Knowledge via Knowledge Review
Pengguang Chen, Shu-Lin Liu, Hengshuang Zhao, Jiaya Jia
19 Apr 2021

Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching
Mingi Ji, Byeongho Heo, Sungrae Park
05 Feb 2021

MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications (3DH)
Andrew G. Howard, Menglong Zhu, Bo Chen, Dmitry Kalenichenko, Weijun Wang, Tobias Weyand, M. Andreetto, Hartwig Adam
17 Apr 2017