ResearchTrend.AI

arXiv:2111.11837
Focal and Global Knowledge Distillation for Detectors

23 November 2021
Zhendong Yang
Zhe Li
Xiaohu Jiang
Yuan Gong
Zehuan Yuan
Danpei Zhao
Chun Yuan
    FedML
    ObjD

Papers citing "Focal and Global Knowledge Distillation for Detectors"

24 / 24 papers shown
FEDS: Feature and Entropy-Based Distillation Strategy for Efficient Learned Image Compression
H. Fu
Jie Liang
Zhenman Fang
Jingning Han
33
0
0
09 Mar 2025
VRM: Knowledge Distillation via Virtual Relation Matching
W. Zhang
Fei Xie
Weidong Cai
Chao Ma
68
0
0
28 Feb 2025
DiReDi: Distillation and Reverse Distillation for AIoT Applications
Chen Sun
Qing Tong
Wenshuang Yang
Wenqi Zhang
23
0
0
12 Sep 2024
Relation Modeling and Distillation for Learning with Noisy Labels
Xiaming Chen
Junlin Zhang
Zhuang Qi
Xin Qi
NoLa
19
0
0
30 May 2024
Active Object Detection with Knowledge Aggregation and Distillation from Large Models
Dejie Yang
Yang Liu
35
3
0
21 May 2024
CKD: Contrastive Knowledge Distillation from A Sample-wise Perspective
Wencheng Zhu
Xin Zhou
Pengfei Zhu
Yu Wang
Qinghua Hu
VLM
53
1
0
22 Apr 2024
MergeNet: Knowledge Migration across Heterogeneous Models, Tasks, and Modalities
Kunxi Li
Tianyu Zhan
Kairui Fu
Shengyu Zhang
Kun Kuang
Jiwei Li
Zhou Zhao
Fei Wu
MoMe
22
0
0
20 Apr 2024
RadarDistill: Boosting Radar-based Object Detection Performance via Knowledge Distillation from LiDAR Features
Geonho Bang
Kwangjin Choi
Jisong Kim
Dongsuk Kum
Jun Won Choi
36
13
0
08 Mar 2024
Indirect Gradient Matching for Adversarial Robust Distillation
Hongsin Lee
Seungju Cho
Changick Kim
AAML
FedML
39
2
0
06 Dec 2023
DistillBEV: Boosting Multi-Camera 3D Object Detection with Cross-Modal Knowledge Distillation
Zeyu Wang
Dingwen Li
Chenxu Luo
Cihang Xie
Xiaodong Yang
27
23
0
26 Sep 2023
PanoSwin: a Pano-style Swin Transformer for Panorama Understanding
Zhixin Ling
Zhen Xing
Xiangdong Zhou
Manliang Cao
G. Zhou
ViT
13
17
0
28 Aug 2023
Quantized Feature Distillation for Network Quantization
Kevin Zhu
Yin He
Jianxin Wu
MQ
19
8
0
20 Jul 2023
CrossKD: Cross-Head Knowledge Distillation for Object Detection
Jiabao Wang
Yuming Chen
Zhaohui Zheng
Xiang Li
Ming-Ming Cheng
Qibin Hou
38
30
0
20 Jun 2023
Knowledge Diffusion for Distillation
Tao Huang
Yuan Zhang
Mingkai Zheng
Shan You
Fei Wang
Chao Qian
Chang Xu
29
48
0
25 May 2023
Geometric-aware Pretraining for Vision-centric 3D Object Detection
Linyan Huang
Huijie Wang
J. Zeng
Shengchuan Zhang
Liujuan Cao
Junchi Yan
Hongyang Li
3DPC
57
9
0
06 Apr 2023
DAMO-StreamNet: Optimizing Streaming Perception in Autonomous Driving
Ju He
Zhi-Qi Cheng
Chenyang Li
Wangmeng Xiang
Binghui Chen
Bin Luo
Yifeng Geng
Xuansong Xie
AI4CE
14
19
0
30 Mar 2023
From Knowledge Distillation to Self-Knowledge Distillation: A Unified Approach with Normalized Loss and Customized Soft Labels
Zhendong Yang
Ailing Zeng
Zhe Li
Tianke Zhang
Chun Yuan
Yu Li
11
70
0
23 Mar 2023
X$^3$KD: Knowledge Distillation Across Modalities, Tasks and Stages for Multi-Camera 3D Object Detection
Marvin Klingner
Shubhankar Borse
V. Kumar
B. Rezaei
V. Narayanan
S. Yogamani
Fatih Porikli
29
21
0
03 Mar 2023
Privileged Prior Information Distillation for Image Matting
Chengzhi Lyu
Jiake Xie
Bo Xu
Cheng Lu
Han Huang
Xin Huang
Ming Wu
Chuang Zhang
Yong Tang
13
1
0
25 Nov 2022
Rethinking Knowledge Distillation via Cross-Entropy
Zhendong Yang
Zhe Li
Yuan Gong
Tianke Zhang
Shanshan Lao
Chun Yuan
Yu Li
25
14
0
22 Aug 2022
Task-Balanced Distillation for Object Detection
Ruining Tang
Zhen-yu Liu
Yangguang Li
Yiguo Song
Hui Liu
Qide Wang
Jing Shao
Guifang Duan
Jianrong Tan
19
20
0
05 Aug 2022
Towards Efficient 3D Object Detection with Knowledge Distillation
Jihan Yang
Shaoshuai Shi
Runyu Ding
Zhe Wang
Xiaojuan Qi
102
45
0
30 May 2022
Masked Generative Distillation
Zhendong Yang
Zhe Li
Mingqi Shao
Dachuan Shi
Zehuan Yuan
Chun Yuan
FedML
19
168
0
03 May 2022
Aggregated Residual Transformations for Deep Neural Networks
Saining Xie
Ross B. Girshick
Piotr Dollár
Z. Tu
Kaiming He
261
10,106
0
16 Nov 2016