ViTKD: Practical Guidelines for ViT feature knowledge distillation
6 September 2022
Authors: Zhendong Yang, Zhe Li, Ailing Zeng, Zexian Li, Chun Yuan, Yu Li
Papers citing "ViTKD: Practical Guidelines for ViT feature knowledge distillation" (7 of 7 papers shown)
Continual Distillation Learning: Knowledge Distillation in Prompt-based Continual Learning
Qifan Zhang, Yunhui Guo, Yu Xiang
18 Jul 2024 (VLM, CLL)
Semi-supervised ViT knowledge distillation network with style transfer normalization for colorectal liver metastases survival prediction
Mohamed El Amine Elforaici, E. Montagnon, Francisco Perdigon Romero, W. Le, F. Azzi, Dominique Trudel, Bich Nguyen, Simon Turcotte, An Tang, Samuel Kadoury
17 Nov 2023 (MedIm)
From Knowledge Distillation to Self-Knowledge Distillation: A Unified Approach with Normalized Loss and Customized Soft Labels
Zhendong Yang, Ailing Zeng, Zhe Li, Tianke Zhang, Chun Yuan, Yu Li
23 Mar 2023
Masked Autoencoders Are Scalable Vision Learners
Kaiming He, Xinlei Chen, Saining Xie, Yanghao Li, Piotr Dollár, Ross B. Girshick
11 Nov 2021 (ViT, TPM)
Distilling Knowledge via Knowledge Review
Pengguang Chen, Shu-Lin Liu, Hengshuang Zhao, Jiaya Jia
19 Apr 2021
Transformer in Transformer
Kai Han, An Xiao, Enhua Wu, Jianyuan Guo, Chunjing Xu, Yunhe Wang
27 Feb 2021 (ViT)
Localization Distillation for Dense Object Detection
Zhaohui Zheng, Rongguang Ye, Ping Wang, Dongwei Ren, W. Zuo, Qibin Hou, Ming-Ming Cheng
24 Feb 2021 (ObjD)