On the Transferability of Pre-trained Language Models: A Study from Artificial Datasets
Cheng-Han Chiang, Hung-yi Lee
arXiv:2109.03537 · 8 September 2021
Papers citing "On the Transferability of Pre-trained Language Models: A Study from Artificial Datasets" (7 of 7 shown):
1. SPEC: Summary Preference Decomposition for Low-Resource Abstractive Summarization
   Yi-Syuan Chen, Yun-Zhu Song, Hong-Han Shuai — 24 Mar 2023

2. On the Effect of Pre-training for Transformer in Different Modality on Offline Reinforcement Learning
   S. Takagi — 17 Nov 2022

3. Intermediate Fine-Tuning Using Imperfect Synthetic Speech for Improving Electrolaryngeal Speech Recognition
   Lester Phillip Violeta, D. Ma, Wen-Chin Huang, T. Toda — 02 Nov 2022

4. Robustness of Demonstration-based Learning Under Limited Data Scenario
   Hongxin Zhang, Yanzhe Zhang, Ruiyi Zhang, Diyi Yang — 19 Oct 2022

5. Insights into Pre-training via Simpler Synthetic Tasks
   Yuhuai Wu, Felix Li, Percy Liang — 21 Jun 2022

6. Pretraining with Artificial Language: Studying Transferable Knowledge in Language Models
   Ryokan Ri, Yoshimasa Tsuruoka — 19 Mar 2022

7. GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
   Alex Wang, Amanpreet Singh, Julian Michael, Felix Hill, Omer Levy, Samuel R. Bowman — 20 Apr 2018