arXiv 2205.07257
Not to Overfit or Underfit the Source Domains? An Empirical Study of Domain Generalization in Question Answering
Md Arafat Sultan, Avirup Sil, Radu Florian
15 May 2022 · OOD
Papers citing "Not to Overfit or Underfit the Source Domains? An Empirical Study of Domain Generalization in Question Answering" (6 of 6 papers shown)
An Empirical Investigation into the Effect of Parameter Choices in Knowledge Distillation
Md Arafat Sultan, Aashka Trivedi, Parul Awasthy, Avirup Sil
22 · 0 · 0 · 12 Jan 2024
Can LLMs Grade Short-Answer Reading Comprehension Questions: An Empirical Study with a Novel Dataset
Owen Henkel, Libby Hills, Bill Roberts, Joshua A. McGrane
AI4Ed · 19 · 1 · 0 · 26 Oct 2023
Knowledge Distillation ≈ Label Smoothing: Fact or Fallacy?
Md Arafat Sultan
17 · 2 · 0 · 30 Jan 2023
PrimeQA: The Prime Repository for State-of-the-Art Multilingual Question Answering Research and Development
Avirup Sil, Jaydeep Sen, Bhavani Iyer, M. Franz, Kshitij P. Fadnis, ..., Yulong Li, Md Arafat Sultan, Riyaz Ahmad Bhat, Radu Florian, Salim Roukos
12 · 4 · 0 · 23 Jan 2023
Improved Synthetic Training for Reading Comprehension
Yanda Chen, Md Arafat Sultan, T. J. W. R. Center
SyDa · 11 · 5 · 0 · 24 Oct 2020
Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks
Chelsea Finn, Pieter Abbeel, Sergey Levine
OOD · 243 · 11,659 · 0 · 09 Mar 2017