Improved Feature Distillation via Projector Ensemble

27 October 2022
Yudong Chen, Sen Wang, Jiajun Liu, Xuwei Xu, Frank de Hoog, Zi Huang
arXiv: 2210.15274
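
Since this page lists only titles, a minimal sketch of the idea named in the headline paper may help orient the reader: the student's features are passed through several independently initialized nonlinear projectors, the projected features are averaged, and the average is matched to the teacher's features. This is a sketch under assumptions; the two-layer MLP projectors and the plain MSE matching loss below are illustrative choices, not necessarily the paper's exact projector architecture or loss weighting.

import torch
import torch.nn as nn
import torch.nn.functional as F

class ProjectorEnsembleDistillation(nn.Module):
    # Each projector independently maps student features to the teacher's
    # feature dimension; the projected features are averaged before matching.
    # The projectors are nonlinear, so the ensemble is not equivalent to a
    # single linear map (an ensemble of purely linear projectors would be).
    def __init__(self, student_dim: int, teacher_dim: int, num_projectors: int = 3):
        super().__init__()
        self.projectors = nn.ModuleList(
            nn.Sequential(
                nn.Linear(student_dim, teacher_dim),
                nn.ReLU(inplace=True),
                nn.Linear(teacher_dim, teacher_dim),
            )
            for _ in range(num_projectors)
        )

    def forward(self, student_feat: torch.Tensor, teacher_feat: torch.Tensor) -> torch.Tensor:
        # Project with every head and average the projections (the "ensemble").
        projections = torch.stack([p(student_feat) for p in self.projectors])
        ensemble = projections.mean(dim=0)
        # Match the ensembled projection to the (frozen) teacher features.
        return F.mse_loss(ensemble, teacher_feat)

# Hypothetical usage, with random tensors standing in for backbone features.
distiller = ProjectorEnsembleDistillation(student_dim=512, teacher_dim=2048)
s = torch.randn(8, 512)            # student penultimate features
t = torch.randn(8, 2048).detach()  # teacher features, treated as constant targets
loss = distiller(s, t)             # combined with the task loss during training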

Papers citing "Improved Feature Distillation via Projector Ensemble" (25 papers)

MoKD: Multi-Task Optimization for Knowledge Distillation
Zeeshan Hayder, A. Cheraghian, Lars Petersson, Mehrtash Harandi · 13 May 2025 · VLM

VRM: Knowledge Distillation via Virtual Relation Matching
W. Zhang, Fei Xie, Weidong Cai, Chao Ma · 28 Feb 2025

Learning to Generalize without Bias for Open-Vocabulary Action Recognition
Yating Yu, Congqi Cao, Yifan Zhang, Yanning Zhang · 27 Feb 2025 · VLM

Cross-View Consistency Regularisation for Knowledge Distillation
W. Zhang, Dongnan Liu, Weidong Cai, Chao Ma · 21 Dec 2024

Wasserstein Distance Rivals Kullback-Leibler Divergence for Knowledge Distillation
Jiaming Lv, Haoyuan Yang, P. Li · 11 Dec 2024

Cross-Domain Knowledge Distillation for Low-Resolution Human Pose Estimation
Zejun Gu, Zhongming Zhao, Henghui Ding, Hao Shen, Zhao Zhang, De-Shuang Huang · 19 May 2024

Theoretical Bound-Guided Hierarchical VAE for Neural Image Codecs
Yichi Zhang, Zhihao Duan, Yuning Huang, F. Zhu · 27 Mar 2024

Learning to Project for Cross-Task Knowledge Distillation
Dylan Auty, Roy Miles, Benedikt Kolbeinsson, K. Mikolajczyk · 21 Mar 2024

$V_kD$: Improving Knowledge Distillation using Orthogonal Projections
Roy Miles, Ismail Elezi, Jiankang Deng · 10 Mar 2024

FROSTER: Frozen CLIP Is A Strong Teacher for Open-Vocabulary Action Recognition
Xiaohui Huang, Hao Zhou, Kun Yao, Kai Han · 05 Feb 2024 · VLM

Self-supervised Video Object Segmentation with Distillation Learning of Deformable Attention
Quang-Trung Truong, Duc Thanh Nguyen, Binh-Son Hua, Sai-Kit Yeung · 25 Jan 2024 · VOS

Understanding the Effects of Projectors in Knowledge Distillation
Yudong Chen, Sen Wang, Jiajun Liu, Xuwei Xu, Frank de Hoog, Brano Kusy, Zi Huang · 26 Oct 2023

Debiasing, calibrating, and improving Semi-supervised Learning performance via simple Ensemble Projector
Khanh-Binh Nguyen · 24 Oct 2023

Bidirectional Knowledge Reconfiguration for Lightweight Point Cloud Analysis
Peipei Li, Xing Cui, Yibo Hu, Man Zhang, Ting Yao, Tao Mei · 08 Oct 2023

VanillaKD: Revisit the Power of Vanilla Knowledge Distillation from Small Scale to Large Scale
Zhiwei Hao, Jianyuan Guo, Kai Han, Han Hu, Chang Xu, Yunhe Wang · 25 May 2023

Temporal Contrastive Learning for Spiking Neural Networks
Haonan Qiu, Zeyin Song, Yanqing Chen, Munan Ning, Wei Fang, Tao Sun, Zhengyu Ma, Liuliang Yuan, Yonghong Tian · 23 May 2023

Is Synthetic Data From Diffusion Models Ready for Knowledge Distillation?
Zheng Li, Yuxuan Li, Penghai Zhao, Renjie Song, Xiang Li, Jian Yang · 22 May 2023

Understanding the Role of the Projector in Knowledge Distillation
Roy Miles, K. Mikolajczyk · 20 Mar 2023

Reconstructing Pruned Filters using Cheap Spatial Transformations
Roy Miles, K. Mikolajczyk · 25 Oct 2021

Distilling Knowledge via Knowledge Review
Pengguang Chen, Shu-Lin Liu, Hengshuang Zhao, Jiaya Jia · 19 Apr 2021

Show, Attend and Distill: Knowledge Distillation via Attention-based Feature Matching
Mingi Ji, Byeongho Heo, Sungrae Park · 05 Feb 2021

Semantics Disentangling for Generalized Zero-Shot Learning
Zhi Chen, Yadan Luo, Ruihong Qiu, Sen Wang, Zi Huang, Jingjing Li, Zheng-Wei Zhang · 20 Jan 2021

MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications
Andrew G. Howard, Menglong Zhu, Bo Chen, Dmitry Kalenichenko, Weijun Wang, Tobias Weyand, M. Andreetto, Hartwig Adam · 17 Apr 2017 · 3DH

Densely Connected Convolutional Networks
Gao Huang, Zhuang Liu, L. van der Maaten, Kilian Q. Weinberger · 25 Aug 2016 · PINN, 3DV

ImageNet Large Scale Visual Recognition Challenge
Olga Russakovsky, Jia Deng, Hao Su, J. Krause, S. Satheesh, ..., A. Karpathy, A. Khosla, Michael S. Bernstein, Alexander C. Berg, Li Fei-Fei · 01 Sep 2014 · VLM, ObjD