Triangular Architecture for Rare Language Translation

Abstract

Neural Machine Translation (NMT) performs poorly on a low-resource language pair (X, Z), especially when Z is a rare language. By introducing another rich language Y, we propose a novel triangular training architecture (TA-NMT) that leverages the bilingual data (Y, Z) (which may be small) and (X, Y) (which can be rich) to improve the translation performance of the low-resource pair. In this triangular architecture, Z is treated as an intermediate latent variable, and the translation models involving Z are jointly optimized with a unified bidirectional EM algorithm under the objective of maximizing the translation likelihood of (X, Y). Empirical results demonstrate that our method significantly improves the translation quality of rare languages on the MultiUN and IWSLT2012 datasets, and achieves even better performance when combined with back-translation methods.
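The latent-variable objective described above can be sketched as follows (a minimal illustration, not the paper's full derivation): treating Z as latent, the likelihood of a rich pair (x, y) is marginalized over z, and a variational distribution Q(z) yields the standard EM lower bound that the translation models of Z can jointly optimize.

\begin{align}
\log p(y \mid x) &= \log \sum_{z} p(z \mid x)\, p(y \mid z) \\
&\geq \mathbb{E}_{Q(z)}\!\left[ \log p(y \mid z) \right] - \mathrm{KL}\!\left( Q(z) \,\|\, p(z \mid x) \right)
\end{align}

Here $p(z \mid x)$ and $p(y \mid z)$ are the two translation models through the rare language Z; the E-step tightens the bound by updating $Q(z)$, and the M-step updates the model parameters. The symmetric bound on $\log p(x \mid y)$ gives the other direction of the bidirectional EM procedure.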
