arXiv: 2311.05778
DONUT-hole: DONUT Sparsification by Harnessing Knowledge and Optimizing Learning Efficiency
9 November 2023
Authors: Azhar Shaikh, Michael Cochez, Denis Diachkov, Michiel de Rijcke, Sahar Yousefi
Papers citing "DONUT-hole: DONUT Sparsification by Harnessing Knowledge and Optimizing Learning Efficiency" (3 papers):
TransKD: Transformer Knowledge Distillation for Efficient Semantic Segmentation (27 Feb 2022). R. Liu, Kailun Yang, Alina Roitberg, Jiaming Zhang, Kunyu Peng, Huayao Liu, Yaonan Wang, Rainer Stiefelhagen. Topic: ViT.
Distilling Knowledge via Knowledge Review (19 Apr 2021). Pengguang Chen, Shu-Lin Liu, Hengshuang Zhao, Jiaya Jia.
LayoutLMv2: Multi-modal Pre-training for Visually-Rich Document Understanding (29 Dec 2020). Yang Xu, Yiheng Xu, Tengchao Lv, Lei Cui, Furu Wei, ..., D. Florêncio, Cha Zhang, Wanxiang Che, Min Zhang, Lidong Zhou. Topics: ViT, MLLM.