arXiv:2211.08071
Knowledge Distillation for Detection Transformer with Consistent Distillation Points Sampling

15 November 2022
Yu Wang, Xin Li, Shengzhao Wen, Fu-En Yang, Wanping Zhang, Gang Zhang, Haocheng Feng, Junyu Han, Errui Ding

Papers citing "Knowledge Distillation for Detection Transformer with Consistent Distillation Points Sampling"

3 papers shown
DAB-DETR: Dynamic Anchor Boxes are Better Queries for DETR
Shilong Liu, Feng Li, Hao Zhang, X. Yang, Xianbiao Qi, Hang Su, Jun Zhu, Lei Zhang
28 Jan 2022
Simple Copy-Paste is a Strong Data Augmentation Method for Instance Segmentation
Golnaz Ghiasi, Yin Cui, A. Srinivas, Rui Qian, Tsung-Yi Lin, E. D. Cubuk, Quoc V. Le, Barret Zoph
13 Dec 2020
Meta R-CNN: Towards General Solver for Instance-level Few-shot Learning
Xiaopeng Yan, Ziliang Chen, Anni Xu, Xiaoxi Wang, Xiaodan Liang, Liang Lin
28 Sep 2019