Learning Efficient Detector with Semi-supervised Adaptive Distillation

2 January 2019
Shitao Tang, Xue Jiang, Wenqi Shao, Zhanghui Kuang, Wayne Zhang, Yimin Chen
arXiv (abs) · PDF · HTML · GitHub (59★)

Papers citing "Learning Efficient Detector with Semi-supervised Adaptive Distillation"

10 papers
Adaptive Self-Distillation for Minimizing Client Drift in Heterogeneous Federated Learning
M. Yashwanth, Gaurav Kumar Nayak, Aryaveer Singh, Yogesh Singh, Anirban Chakraborty
31 May 2023

AMD: Adaptive Masked Distillation for Object Detection (IEEE International Joint Conference on Neural Networks (IJCNN), 2023)
Guang-hong Yang, Yin Tang, Jun Li, Jianhua Xu, Xili Wan
31 Jan 2023

Dynamic Contrastive Distillation for Image-Text Retrieval (IEEE Transactions on Multimedia, 2022)
Jun Rao, Liang Ding, Shuhan Qi, Meng Fang, Yang Liu, Liqiong Shen, Dacheng Tao
04 Jul 2022

Distillation from heterogeneous unlabeled collections
Jean-Michel Begon, Pierre Geurts
17 Jan 2022

Teacher's pet: understanding and mitigating biases in distillation
Michal Lukasik, Srinadh Bhojanapalli, A. Menon, Sanjiv Kumar
19 Jun 2021

WSSOD: A New Pipeline for Weakly- and Semi-Supervised Object Detection
Shijie Fang, Yuhang Cao, Xinjiang Wang, Kai-xiang Chen, Dahua Lin, Wayne Zhang
21 May 2021

Distilling Object Detectors via Decoupled Features (Computer Vision and Pattern Recognition (CVPR), 2021)
Jianyuan Guo, Kai Han, Yunhe Wang, Han Wu, Xinghao Chen, Chunjing Xu, Chang Xu
26 Mar 2021

Rethinking Soft Labels for Knowledge Distillation: A Bias-Variance Tradeoff Perspective (International Conference on Learning Representations (ICLR), 2021)
Helong Zhou, Liangchen Song, Jiajie Chen, Ye Zhou, Guoli Wang, Junsong Yuan, Qian Zhang
01 Feb 2021

aw_nas: A Modularized and Extensible NAS framework
Xuefei Ning, Changcheng Tang, Wenshuo Li, Songyi Yang, Tianchen Zhao, Niansong Zhang, Tianyi Lu, Shuang Liang, Huazhong Yang, Yu Wang
25 Nov 2020

Improving Route Choice Models by Incorporating Contextual Factors via Knowledge Distillation
Qun Liu, S. Mukhopadhyay, Yimin Zhu, Ravindra Gudishala, Sanaz Saeidi, Alimire Nabijiang
27 Mar 2019