ResearchTrend.AI

An Empirical Investigation into the Effect of Parameter Choices in Knowledge Distillation

12 January 2024
Md Arafat Sultan, Aashka Trivedi, Parul Awasthy, Avirup Sil

Papers citing "An Empirical Investigation into the Effect of Parameter Choices in Knowledge Distillation"

3 papers shown
Not to Overfit or Underfit the Source Domains? An Empirical Study of Domain Generalization in Question Answering
Md Arafat Sultan, Avirup Sil, Radu Florian
OOD
15 May 2022
Cross-Task Knowledge Distillation in Multi-Task Recommendation
Chenxiao Yang, Junwei Pan, Xiaofeng Gao, Tingyu Jiang, Dapeng Liu, Guihai Chen
20 Feb 2022
MLQA: Evaluating Cross-lingual Extractive Question Answering
Patrick Lewis, Barlas Oğuz, Ruty Rinott, Sebastian Riedel, Holger Schwenk
ELM
16 Oct 2019