Knowledge Distillation in Document Retrieval

11 November 2019
Siamak Shakeri, A. Sethy, Cheng Cheng
Community: FedML
Links: arXiv (abs) · PDF · HTML

Papers citing "Knowledge Distillation in Document Retrieval"

2 / 2 papers shown
Multitask Prompt Tuning Enables Parameter-Efficient Transfer Learning
International Conference on Learning Representations (ICLR), 2023
Zhen Wang, Yikang Shen, Leonid Karlinsky, Rogerio Feris, Huan Sun, Yoon Kim
Communities: VLM, VP
224 · 151 · 0
06 Mar 2023

Knowledge Distillation: A Survey
Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao
Communities: VLM
2.0K · 3,768 · 0
09 Jun 2020