arXiv: 2306.08756
Recipes for Sequential Pre-training of Multilingual Encoder and Seq2Seq Models
Saleh Soltan, Andrew Rosenbaum, Tobias Falke, Qin Lu, Anna Rumshisky, Wael Hamza
14 June 2023
Papers citing "Recipes for Sequential Pre-training of Multilingual Encoder and Seq2Seq Models" (2 papers)
EncT5: A Framework for Fine-tuning T5 as Non-autoregressive Models
Frederick Liu, T. Huang, Shihang Lyu, Siamak Shakeri, Hongkun Yu, Jing Li
16 Oct 2021
Multitask Prompted Training Enables Zero-Shot Task Generalization
Victor Sanh, Albert Webson, Colin Raffel, Stephen H. Bach, Lintang Sutawika, ..., T. Bers, Stella Biderman, Leo Gao, Thomas Wolf, Alexander M. Rush
15 Oct 2021