Distilling Inductive Bias: Knowledge Distillation Beyond Model Compression
arXiv: 2310.00369 · 30 September 2023
Gousia Habib, Tausifa Jan Saleem, Brejesh Lall
Papers citing "Distilling Inductive Bias: Knowledge Distillation Beyond Model Compression" (2 of 2 papers shown)
MobileViT: Light-weight, General-purpose, and Mobile-friendly Vision Transformer
Sachin Mehta, Mohammad Rastegari
05 Oct 2021
Transformers in Vision: A Survey
Salman Khan, Muzammal Naseer, Munawar Hayat, Syed Waqas Zamir, F. Khan, M. Shah
04 Jan 2021