arXiv:2311.12079
FreeKD: Knowledge Distillation via Semantic Frequency Prompt
20 November 2023
Yuan Zhang, Tao Huang, Jiaming Liu, Tao Jiang, Kuan Cheng, Shanghang Zhang
AAML
Papers citing "FreeKD: Knowledge Distillation via Semantic Frequency Prompt" (5 / 5 papers shown)
MoLe-VLA: Dynamic Layer-skipping Vision Language Action Model via Mixture-of-Layers for Efficient Robot Manipulation
Rongyu Zhang, Menghang Dong, Yuan Zhang, Liang Heng, Xiaowei Chi, Gaole Dai, Li Du, Dan Wang, Yuan Du
MoE · 26 Mar 2025
Vision Foundation Models in Medical Image Analysis: Advances and Challenges
Pengchen Liang, Bin Pu, Haishan Huang, Yiwei Li, H. Wang, Weibo Ma, Qing Chang
VLM, MedIm · 24 Feb 2025
Avatar Knowledge Distillation: Self-ensemble Teacher Paradigm with Uncertainty
Yuan Zhang, Weihua Chen, Yichen Lu, Tao Huang, Xiuyu Sun, Jian Cao
04 May 2023
Feature Pyramid Networks for Object Detection
Tsung-Yi Lin, Piotr Dollár, Ross B. Girshick, Kaiming He, Bharath Hariharan, Serge J. Belongie
ObjD · 09 Dec 2016
Aggregated Residual Transformations for Deep Neural Networks
Saining Xie, Ross B. Girshick, Piotr Dollár, Z. Tu, Kaiming He
16 Nov 2016