Multilingual Multi-Domain Adaptation Approaches for Neural Machine Translation

19 June 2019
Chenhui Chu
Raj Dabre
arXiv:1906.07978
Abstract

In this paper, we propose two novel methods for domain adaptation of the attention-only neural machine translation (NMT) model, i.e., the Transformer. Our methods focus on training a single translation model for multiple domains by learning either domain-specialized hidden-state representations or predictor biases for each domain. We combine our methods with a previously proposed black-box method called mixed fine tuning, which is known to be highly effective for domain adaptation. In addition, we incorporate multilingualism into the domain adaptation framework. Experiments show that multilingual multi-domain adaptation can significantly improve both resource-poor in-domain and resource-rich out-of-domain translations, and that the combination of our methods with mixed fine tuning achieves the best performance.
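To make the predictor-bias idea concrete, here is a minimal PyTorch sketch of an output layer that shares one vocabulary projection across all domains while adding a learned per-domain bias to the logits. This is an illustration under assumptions, not the authors' implementation: the class name DomainBiasedGenerator and the arguments num_domains and domain_id are hypothetical, and the rest of the Transformer decoder is omitted.

```python
import torch
import torch.nn as nn

class DomainBiasedGenerator(nn.Module):
    """Output layer with one learned bias vector per domain.

    Sketch of the 'predictor bias' idea from the abstract: a shared
    vocabulary projection is used for every domain, while each domain d
    gets its own bias vector b_d added to the logits. All names here
    are hypothetical, not the authors' code.
    """

    def __init__(self, hidden_size: int, vocab_size: int, num_domains: int):
        super().__init__()
        # Shared vocabulary projection (its own bias disabled).
        self.proj = nn.Linear(hidden_size, vocab_size, bias=False)
        # One bias vector over the vocabulary per domain.
        self.domain_bias = nn.Embedding(num_domains, vocab_size)

    def forward(self, hidden: torch.Tensor, domain_id: torch.Tensor) -> torch.Tensor:
        # hidden: (batch, seq_len, hidden_size); domain_id: (batch,)
        logits = self.proj(hidden)                       # (batch, seq_len, vocab)
        bias = self.domain_bias(domain_id).unsqueeze(1)  # (batch, 1, vocab)
        return logits + bias                             # broadcast over seq_len

# Toy usage: two domains, tiny dimensions.
gen = DomainBiasedGenerator(hidden_size=8, vocab_size=16, num_domains=2)
h = torch.randn(4, 5, 8)         # decoder states for a batch of 4 sentences
d = torch.tensor([0, 1, 0, 1])   # domain tag per sentence
print(gen(h, d).shape)           # torch.Size([4, 5, 16])
```

Because each domain contributes only a single vocabulary-sized bias vector, the per-domain overhead is tiny compared with duplicating the full model, which fits the single-model-for-multiple-domains goal described in the abstract.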
