Exploiting Out-of-Domain Parallel Data through Multilingual Transfer Learning for Low-Resource Neural Machine Translation (arXiv:1907.03060)

6 July 2019
Aizhan Imankulova, Raj Dabre, Atsushi Fujita, K. Imamura

Papers citing "Exploiting Out-of-Domain Parallel Data through Multilingual Transfer Learning for Low-Resource Neural Machine Translation"

13 citing papers:
adaptMLLM: Fine-Tuning Multilingual Language Models on Low-Resource Languages with Integrated LLM Playgrounds
Séamus Lankford, Haithem Afli, Andy Way
04 Mar 2024

Human Evaluation of English-Irish Transformer-Based NMT
Séamus Lankford, Haithem Afli, Andy Way
04 Mar 2024

CreoleVal: Multilingual Multitask Benchmarks for Creoles
Heather Lent, Kushal Tatariya, Raj Dabre, Yiyi Chen, Marcell Richard Fekete, ..., Miryam de Lhoneux, Daniel Hershcovich, Michel DeGraff, Anders Sogaard, Johannes Bjerva
30 Oct 2023 · SLR

Leveraging Auxiliary Domain Parallel Data in Intermediate Task Fine-tuning for Low-resource Translation
Shravan Nayak, Surangika Ranathunga, Sarubi Thillainathan, Rikki Hung, Anthony Rinaldi, Yining Wang, Jonah Mackey, Andrew Ho, E. Lee
02 Jun 2023

Learning to Parallelize in a Shared-Memory Environment with Transformers
Reém Harel, Yuval Pinter, Gal Oren
27 Apr 2022

Attentive fine-tuning of Transformers for Translation of low-resourced languages @LoResMT 2021
Karthik Puranik, Adeep Hande, R. Priyadharshini, Thenmozi Durairaj, Anbukkarasi Sampath, K. Thamburaj, Bharathi Raja Chakravarthi
19 Aug 2021

Neural Machine Translation for Low-Resource Languages: A Survey
Surangika Ranathunga, E. Lee, Marjana Prifti Skenduli, Ravi Shekhar, Mehreen Alam, Rishemjit Kaur
29 Jun 2021

Continual Mixed-Language Pre-Training for Extremely Low-Resource Neural Machine Translation
Zihan Liu, Genta Indra Winata, Pascale Fung
09 May 2021 · VLM, CLL

Gradual Fine-Tuning for Low-Resource Domain Adaptation
Haoran Xu, Seth Ebner, M. Yarmohammadi, A. White, Benjamin Van Durme, Kenton W. Murray
03 Mar 2021 · CLL

JASS: Japanese-specific Sequence to Sequence Pre-training for Neural Machine Translation
Zhuoyuan Mao, Fabien Cromierès, Raj Dabre, Haiyue Song, Sadao Kurohashi
07 May 2020

Coursera Corpus Mining and Multistage Fine-Tuning for Improving Lectures Translation
Haiyue Song, Raj Dabre, Atsushi Fujita, Sadao Kurohashi
26 Dec 2019

Six Challenges for Neural Machine Translation
Philipp Koehn, Rebecca Knowles
12 Jun 2017 · AAML, AIMat

Multi-Way, Multilingual Neural Machine Translation with a Shared Attention Mechanism
Orhan Firat, Kyunghyun Cho, Yoshua Bengio
06 Jan 2016 · LRM, AIMat