Recipes for Sequential Pre-training of Multilingual Encoder and Seq2Seq Models

14 June 2023
Saleh Soltan, Andrew Rosenbaum, Tobias Falke, Qin Lu, Anna Rumshisky, Wael Hamza
arXiv:2306.08756

Papers citing "Recipes for Sequential Pre-training of Multilingual Encoder and Seq2Seq Models"

2 / 2 papers shown
EncT5: A Framework for Fine-tuning T5 as Non-autoregressive Models
Frederick Liu, T. Huang, Shihang Lyu, Siamak Shakeri, Hongkun Yu, Jing Li
16 Oct 2021
Multitask Prompted Training Enables Zero-Shot Task Generalization
Victor Sanh, Albert Webson, Colin Raffel, Stephen H. Bach, Lintang Sutawika, ..., T. Bers, Stella Biderman, Leo Gao, Thomas Wolf, Alexander M. Rush
15 Oct 2021