MSP: Multi-Stage Prompting for Making Pre-trained Language Models Better Translators
Zhixing Tan, Xiangwen Zhang, Shuo Wang, Yang Liu
arXiv:2110.06609 · 13 October 2021 · Tags: VLM, LRM
Papers citing "MSP: Multi-Stage Prompting for Making Pre-trained Language Models Better Translators" (5 of 5 papers shown)
| Title | Authors | Tags | Metrics | Date |
|---|---|---|---|---|
| Multilingual Translation via Grafting Pre-trained Language Models | Zewei Sun, Mingxuan Wang, Lei Li | AI4CE | 152 / 19 / 0 | 11 Sep 2021 |
| The Power of Scale for Parameter-Efficient Prompt Tuning | Brian Lester, Rami Al-Rfou, Noah Constant | VPVLM | 257 / 2,999 / 0 | 18 Apr 2021 |
| Making Pre-trained Language Models Better Few-shot Learners | Tianyu Gao, Adam Fisch, Danqi Chen | | 225 / 1,649 / 0 | 31 Dec 2020 |
| Megatron-LM: Training Multi-Billion Parameter Language Models Using Model Parallelism | M. Shoeybi, M. Patwary, Raul Puri, P. LeGresley, Jared Casper, Bryan Catanzaro | MoE | 224 / 1,436 / 0 | 17 Sep 2019 |
| Six Challenges for Neural Machine Translation | Philipp Koehn, Rebecca Knowles | AAML, AIMat | 177 / 1,144 / 0 | 12 Jun 2017 |