Knowledge Distillation for Object Detection via Rank Mimicking and Prediction-guided Feature Imitation
arXiv:2112.04840 · 9 December 2021
Gang Li, Xiang Li, Yujie Wang, Shanshan Zhang, Yichao Wu, Ding Liang
ObjD
Papers citing "Knowledge Distillation for Object Detection via Rank Mimicking and Prediction-guided Feature Imitation" (32 papers)

Image Recognition with Online Lightweight Vision Transformer: A Survey
Zherui Zhang, Rongtao Xu, Jie Zhou, Changwei Wang, Xingtian Pei, ..., Jiguang Zhang, Li Guo, Longxiang Gao, W. Xu, Shibiao Xu
ViT · 48 · 0 · 0 · 06 May 2025

Feature Alignment and Representation Transfer in Knowledge Distillation for Large Language Models
Junjie Yang, Junhao Song, Xudong Han, Ziqian Bi, Tianyang Wang, ..., Y. Zhang, Qian Niu, Benji Peng, Keyu Chen, Ming Liu
VLM · 40 · 0 · 0 · 18 Apr 2025

Random Conditioning with Distillation for Data-Efficient Diffusion Model Compression
Dohyun Kim, S. Park, Geonhee Han, Seung Wook Kim, Paul Hongsuck Seo
DiffM · 45 · 0 · 0 · 02 Apr 2025

Dynamic Textual Prompt For Rehearsal-free Lifelong Person Re-identification
Hongyu Chen, Bingliang Jiao, Wenxuan Wang, Peng Wang
VLM · 31 · 0 · 0 · 09 Nov 2024

Kendall's τ Coefficient for Logits Distillation
Yuchen Guan, Runxi Cheng, Kang Liu, Chun Yuan
21 · 0 · 0 · 26 Sep 2024

Knowledge Distillation via Query Selection for Detection Transformer
Yi Liu, Luting Wang, Zongheng Tang, Yue Liao, Yifan Sun, Lijun Zhang, Si Liu
21 · 0 · 0 · 10 Sep 2024

Estimating Human Poses Across Datasets: A Unified Skeleton and Multi-Teacher Distillation Approach
Muhammad Gul Zain Ali Khan, Dhavalkumar Limbachiya, Didier Stricker, Muhammad Zeshan Afzal
3DH · 18 · 0 · 0 · 30 May 2024

Aligning in a Compact Space: Contrastive Knowledge Distillation between Heterogeneous Architectures
Hongjun Wu, Li Xiao, Xingkuo Zhang, Yining Miao
31 · 1 · 0 · 28 May 2024

Why does Knowledge Distillation Work? Rethink its Attention and Fidelity Mechanism
Chenqi Guo, Shiwei Zhong, Xiaofeng Liu, Qianli Feng, Yinglong Ma
14 · 1 · 0 · 30 Apr 2024

Task Integration Distillation for Object Detectors
Hai Su, ZhenWen Jian, Songsen Yu
33 · 1 · 0 · 02 Apr 2024

Logit Standardization in Knowledge Distillation
Shangquan Sun, Wenqi Ren, Jingzhi Li, Rui Wang, Xiaochun Cao
29 · 55 · 0 · 03 Mar 2024

Generative Denoise Distillation: Simple Stochastic Noises Induce Efficient Knowledge Transfer for Dense Prediction
Zhaoge Liu, Xiaohao Xu, Yunkang Cao, Weiming Shen
VLM · 16 · 0 · 0 · 16 Jan 2024

RdimKD: Generic Distillation Paradigm by Dimensionality Reduction
Yi Guo, Yiqian He, Xiaoyang Li, Haotong Qin, Van Tung Pham, Yang Zhang, Shouda Liu
29 · 1 · 0 · 14 Dec 2023

Towards Generalized Multi-stage Clustering: Multi-view Self-distillation
Jiatai Wang, Zhiwei Xu, Xin Wang, Tao Li
11 · 1 · 0 · 29 Oct 2023

Bidirectional Knowledge Reconfiguration for Lightweight Point Cloud Analysis
Peipei Li, Xing Cui, Yibo Hu, Man Zhang, Ting Yao, Tao Mei
12 · 0 · 0 · 08 Oct 2023

LumiNet: The Bright Side of Perceptual Knowledge Distillation
Md. Ismail Hossain, M. M. L. Elahi, Sameera Ramasinghe, A. Cheraghian, Fuad Rahman, Nabeel Mohammed, Shafin Rahman
21 · 1 · 0 · 05 Oct 2023

Bridging Cross-task Protocol Inconsistency for Distillation in Dense Object Detection
Longrong Yang, Xianpan Zhou, Xuewei Li, Liang Qiao, Zheyang Li, Zi-Liang Yang, Gaoang Wang, Xi Li
16 · 16 · 0 · 28 Aug 2023

Cyclic-Bootstrap Labeling for Weakly Supervised Object Detection
Yufei Yin, Jiajun Deng, Wen-gang Zhou, Li Li, Houqiang Li
22 · 3 · 0 · 11 Aug 2023

CrossKD: Cross-Head Knowledge Distillation for Object Detection
Jiabao Wang, Yuming Chen, Zhaohui Zheng, Xiang Li, Ming-Ming Cheng, Qibin Hou
38 · 30 · 0 · 20 Jun 2023

A Comprehensive Study on Object Detection Techniques in Unconstrained Environments
Hrishitva Patel
ObjD · 8 · 2 · 0 · 11 Apr 2023

A Simple and Generic Framework for Feature Distillation via Channel-wise Transformation
Ziwei Liu, Yongtao Wang, Xiaojie Chu
19 · 5 · 0 · 23 Mar 2023

Adversarial Attacks and Defenses in Machine Learning-Powered Networks: A Contemporary Survey
Yulong Wang, Tong Sun, Shenghong Li, Xinnan Yuan, W. Ni, E. Hossain, H. Vincent Poor
AAML · 18 · 17 · 0 · 11 Mar 2023

Smooth and Stepwise Self-Distillation for Object Detection
Jieren Deng, Xiaoxia Zhou, Hao Tian, Zhihong Pan, Derek Aguiar
ObjD · 8 · 0 · 0 · 09 Mar 2023

Curriculum Temperature for Knowledge Distillation
Zheng Li, Xiang Li, Lingfeng Yang, Borui Zhao, Renjie Song, Lei Luo, Jun Yu Li, Jian Yang
22 · 130 · 0 · 29 Nov 2022

Teach-DETR: Better Training DETR with Teachers
Linjiang Huang, Kaixin Lu, Guanglu Song, Liang Wang, Siyu Liu, Yu Liu, Hongsheng Li
25 · 9 · 0 · 22 Nov 2022

Distilling Object Detectors With Global Knowledge
Sanli Tang, Zhongyu Zhang, Zhanzhan Cheng, Jing Lu, Yunlu Xu, Yi Niu, Fan He
18 · 8 · 0 · 17 Oct 2022

ViTKD: Practical Guidelines for ViT feature knowledge distillation
Zhendong Yang, Zhe Li, Ailing Zeng, Zexian Li, Chun Yuan, Yu Li
81 · 42 · 0 · 06 Sep 2022

Task-Balanced Distillation for Object Detection
Ruining Tang, Zhen-yu Liu, Yangguang Li, Yiguo Song, Hui Liu, Qide Wang, Jing Shao, Guifang Duan, Jianrong Tan
16 · 20 · 0 · 05 Aug 2022

DTG-SSOD: Dense Teacher Guidance for Semi-Supervised Object Detection
Gang Li, Xiang Li, Yujie Wang, Yichao Wu, Ding Liang, Shanshan Zhang
17 · 25 · 0 · 12 Jul 2022

PKD: General Distillation Framework for Object Detectors via Pearson Correlation Coefficient
Weihan Cao, Yifan Zhang, Jianfei Gao, Anda Cheng, Ke Cheng, Jian Cheng
15 · 62 · 0 · 05 Jul 2022

Localization Distillation for Object Detection
Zhaohui Zheng, Rongguang Ye, Ping Wang, Dongwei Ren, Jun Wang, W. Zuo, Ming-Ming Cheng
19 · 63 · 0 · 12 Apr 2022

PseCo: Pseudo Labeling and Consistency Training for Semi-Supervised Object Detection
Gang Li, Xiang Li, Yujie Wang, Yichao Wu, Ding Liang, Shanshan Zhang
12 · 91 · 0 · 30 Mar 2022