arXiv: 2105.12900
How Does Distilled Data Complexity Impact the Quality and Confidence of Non-Autoregressive Machine Translation?
Weijia Xu, Shuming Ma, Dongdong Zhang, Marine Carpuat
27 May 2021
Papers citing "How Does Distilled Data Complexity Impact the Quality and Confidence of Non-Autoregressive Machine Translation?" (5 papers)
Falcon: Faster and Parallel Inference of Large Language Models through Enhanced Semi-Autoregressive Drafting and Custom-Designed Decoding Tree
Xiangxiang Gao, Weisheng Xie, Yiwei Xiang, Feng Ji
17 Dec 2024

Sentence-Level or Token-Level? A Comprehensive Study on Knowledge Distillation
Jingxuan Wei, Linzhuang Sun, Yichong Leng, Xu Tan, Bihui Yu, Ruifeng Guo
23 Apr 2024

A baseline revisited: Pushing the limits of multi-segment models for context-aware translation
Suvodeep Majumde, Stanislas Lauly, Maria Nadejde, Marcello Federico, Georgiana Dinu
19 Oct 2022

Can Multilinguality benefit Non-autoregressive Machine Translation?
Sweta Agrawal, Julia Kreutzer, Colin Cherry
16 Dec 2021

Understanding and Improving Lexical Choice in Non-Autoregressive Translation
Liang Ding, Longyue Wang, Xuebo Liu, Derek F. Wong, Dacheng Tao, Zhaopeng Tu
29 Dec 2020