DETRDistill: A Universal Knowledge Distillation Framework for DETR-families
arXiv:2211.10156 · 17 November 2022
Jiahao Chang, Shuo Wang, Guangkai Xu, Zehui Chen, Chenhongyi Yang, Fengshang Zhao
Papers citing "DETRDistill: A Universal Knowledge Distillation Framework for DETR-families" (8 papers):
VRM: Knowledge Distillation via Virtual Relation Matching
W. Zhang, Fei Xie, Weidong Cai, Chao Ma
28 Feb 2025

Active Object Detection with Knowledge Aggregation and Distillation from Large Models
Dejie Yang, Yang Liu
21 May 2024

SnapCap: Efficient Snapshot Compressive Video Captioning
Jianqiao Sun, Yudi Su, Hao Zhang, Ziheng Cheng, Zequn Zeng, Zhengjue Wang, Bo Chen, Xin Yuan
10 Jan 2024

DAB-DETR: Dynamic Anchor Boxes are Better Queries for DETR [ViT]
Shilong Liu, Feng Li, Hao Zhang, X. Yang, Xianbiao Qi, Hang Su, Jun Zhu, Lei Zhang
28 Jan 2022

ViDT: An Efficient and Effective Fully Transformer-based Object Detector
Hwanjun Song, Deqing Sun, Sanghyuk Chun, Varun Jampani, Dongyoon Han, Byeongho Heo, Wonjae Kim, Ming-Hsuan Yang
08 Oct 2021

Visformer: The Vision-friendly Transformer [ViT]
Zhengsu Chen, Lingxi Xie, Jianwei Niu, Xuefeng Liu, Longhui Wei, Qi Tian
26 Apr 2021

Distilling Knowledge via Knowledge Review
Pengguang Chen, Shu-Lin Liu, Hengshuang Zhao, Jiaya Jia
19 Apr 2021

Localization Distillation for Dense Object Detection [ObjD]
Zhaohui Zheng, Rongguang Ye, Ping Wang, Dongwei Ren, W. Zuo, Qibin Hou, Ming-Ming Cheng
24 Feb 2021