Reprogramming Distillation for Medical Foundation Models
arXiv:2407.06504 · 9 July 2024
Yuhang Zhou, Siyuan Du, Haolin Li, Jiangchao Yao, Ya Zhang, Yanfeng Wang
Papers citing "Reprogramming Distillation for Medical Foundation Models" (6 of 6 shown)
MeLo: Low-rank Adaptation is Better than Fine-tuning for Medical Image Diagnosis
Yitao Zhu, Zhenrong Shen, Zihao Zhao, Sheng Wang, Xin Wang, Xiangyu Zhao, Dinggang Shen, Qian Wang
Tags: MedIm
14 Nov 2023
NORM: Knowledge Distillation via N-to-One Representation Matching
Xiaolong Liu, Lujun Li, Chao Li, Anbang Yao
23 May 2023
LPT: Long-tailed Prompt Tuning for Image Classification
Bowen Dong, Pan Zhou, Shuicheng Yan, W. Zuo
Tags: VPVLM, VLM
03 Oct 2022
AdaptFormer: Adapting Vision Transformers for Scalable Visual Recognition
Shoufa Chen, Chongjian Ge, Zhan Tong, Jiangliu Wang, Yibing Song, Jue Wang, Ping Luo
26 May 2022
Model Reprogramming: Resource-Efficient Cross-Domain Machine Learning
Pin-Yu Chen
Tags: VLM
22 Feb 2022
Masked Autoencoders Are Scalable Vision Learners
Kaiming He, Xinlei Chen, Saining Xie, Yanghao Li, Piotr Dollár, Ross B. Girshick
Tags: ViT, TPM
11 Nov 2021