2011.07164
Language Models not just for Pre-training: Fast Online Neural Noisy Channel Modeling
13 November 2020
Shruti Bhosale, Kyra Yee, Sergey Edunov, Michael Auli
Papers citing "Language Models not just for Pre-training: Fast Online Neural Noisy Channel Modeling" (7 papers)

1. RoCode: A Dataset for Measuring Code Intelligence from Problem Definitions in Romanian
   Adrian Cosma, Ioan-Bogdan Iordache, Paolo Rosso (20 Feb 2024)

2. Don't Rank, Combine! Combining Machine Translation Hypotheses Using Quality Estimation
   Giorgos Vernikos, Andrei Popescu-Belis (12 Jan 2024)

3. Improving Non-autoregressive Translation Quality with Pretrained Language Model, Embedding Distillation and Upsampling Strategy for CTC
   Shensian Syu, Jun Xie, Hung-yi Lee (10 Jun 2023)

4. Amortized Noisy Channel Neural Machine Translation
   Richard Yuanzhe Pang, He He, Kyunghyun Cho (16 Dec 2021)

5. Survey of Low-Resource Machine Translation
   Barry Haddow, Rachel Bawden, Antonio Valerio Miceli Barone, Jindřich Helcl, Alexandra Birch (01 Sep 2021)

6. Facebook AI WMT21 News Translation Task Submission
   C. Tran, Shruti Bhosale, James Cross, Philipp Koehn, Sergey Edunov, Angela Fan (06 Aug 2021)

7. Revisiting Self-Training for Neural Sequence Generation
   Junxian He, Jiatao Gu, Jiajun Shen, Marc'Aurelio Ranzato (30 Sep 2019)