Knowledge Distillation in Document Retrieval
arXiv:1911.11065 · 11 November 2019
Siamak Shakeri, A. Sethy, Cheng Cheng
Community: FedML
Links: ArXiv (abs) · PDF · HTML
Papers citing "Knowledge Distillation in Document Retrieval" (2 papers)
1. Multitask Prompt Tuning Enables Parameter-Efficient Transfer Learning
   Zhen Wang, Yikang Shen, Leonid Karlinsky, Rogerio Feris, Huan Sun, Yoon Kim
   International Conference on Learning Representations (ICLR), 2023 · 06 Mar 2023
   Tags: VLM, VPVLM
2. Knowledge Distillation: A Survey
   Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao
   09 Jun 2020
   Tags: VLM