The NiuTrans Machine Translation Systems for WMT21
Yuhao Zhang, Tao Zhou, Bin Wei, Runzhe Cao, Yongyu Mu, Shuhan Zhou, Ziyang Wang, Xuanjun Zhou, Abudurexiti Reheman, Xin Zeng, Laohu Wang, Jingnan Zhang, Xiaoqian Liu, Weiqiao Shan, Yinqiao Li, Bei Li, Tong Xiao, Jingbo Zhu

Abstract
This paper describes the NiuTrans neural machine translation systems for the WMT 2021 news translation tasks. We made submissions to 9 language directions, including English↔{Chinese, Japanese, Russian, Icelandic} and English→Hausa tasks. Our primary systems are built on several effective variants of the Transformer, e.g., Transformer-DLCL and ODE-Transformer. We also utilize back-translation, knowledge distillation, post-ensemble, and iterative fine-tuning techniques to further enhance model performance.
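Among the listed techniques, post-ensemble selects a final hypothesis after decoding rather than averaging model distributions during beam search. The sketch below illustrates one common consensus-based selection scheme; the token-overlap similarity and selection rule are illustrative assumptions, not the authors' exact implementation.

```python
# Hypothetical sketch of post-ensemble selection (not the paper's exact method):
# each model produces one candidate translation, and the candidate most similar
# to all the others (here, by token-level F1 overlap) is chosen as the output.

from collections import Counter
from typing import List


def token_f1(a: str, b: str) -> float:
    """Token-overlap F1 between two whitespace-tokenized sentences."""
    ca, cb = Counter(a.split()), Counter(b.split())
    overlap = sum((ca & cb).values())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(ca.values())
    recall = overlap / sum(cb.values())
    return 2 * precision * recall / (precision + recall)


def post_ensemble(candidates: List[str]) -> str:
    """Pick the candidate with the highest average similarity to the others."""
    def avg_sim(c: str) -> float:
        others = [o for o in candidates if o is not c]
        return sum(token_f1(c, o) for o in others) / max(len(others), 1)
    return max(candidates, key=avg_sim)


# Example: outputs from three independently trained models for one source sentence.
outputs = [
    "the cat sat on the mat",
    "the cat sits on the mat",
    "a cat was sitting on a mat",
]
print(post_ensemble(outputs))  # -> "the cat sat on the mat"
```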