Data Augmentation With Back translation for Low Resource languages: A case of English and Luganda

5 May 2025
Richard Kimera
DongNyeong Heo
Daniela N. Rim
Heeyoul Choi
Abstract

In this paper, we explore the application of Back translation (BT) as a semi-supervised technique to enhance Neural Machine Translation (NMT) models for the English-Luganda language pair, specifically addressing the challenges faced by low-resource languages. The purpose of our study is to demonstrate how BT can mitigate the scarcity of bilingual data by generating synthetic data from monolingual corpora. Our methodology involves developing custom NMT models using both publicly available and web-crawled data, and applying Iterative and Incremental Back translation techniques. A novel element of our approach is the strategic selection of datasets for incremental back translation across multiple small datasets. Our results show significant improvements, with translation performance for the English-Luganda pair exceeding previous benchmarks by more than 10 BLEU points in both translation directions. Additionally, our evaluation incorporates comprehensive assessment metrics such as SacreBLEU, ChrF2, and TER, providing a nuanced understanding of translation quality. We conclude that BT is effective when strategically curated datasets are used, establishing new performance benchmarks and demonstrating the potential of BT for enhancing NMT models for low-resource languages.
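
As a rough illustration of the back translation loop the abstract describes, the sketch below shows one BT round in Python. The translate and train helpers are hypothetical placeholders, not the authors' code; the sketch only demonstrates the general technique.

# A minimal sketch of one back translation (BT) round, assuming hypothetical
# translate(model, sentences) and train(model, pairs) helpers; this is an
# illustration of the technique, not the authors' implementation.

def back_translation_round(lg_en_model, en_lg_model, mono_luganda,
                           authentic_pairs, translate, train):
    """One BT round for the English->Luganda direction.

    mono_luganda:    list[str], monolingual Luganda sentences
    authentic_pairs: list[tuple[str, str]], (english, luganda) bitext
    """
    # 1. Back-translate monolingual target-side text into the source language.
    synthetic_english = translate(lg_en_model, mono_luganda)

    # 2. Pair each synthetic source sentence with its authentic target sentence.
    synthetic_pairs = list(zip(synthetic_english, mono_luganda))

    # 3. Retrain the forward model on authentic + synthetic bitext.
    return train(en_lg_model, authentic_pairs + synthetic_pairs)

# Iterative BT repeats this round with the two directions swapped; the
# incremental variant curates and feeds in small datasets one at a time.

The three metrics named in the abstract are all implemented by the sacrebleu package, so an evaluation along these lines (shown here with toy data) would look like:

import sacrebleu

hypotheses = ["the cat sat on the mat"]       # system output (toy example)
references = [["the cat is on the mat"]]      # one reference stream

bleu = sacrebleu.corpus_bleu(hypotheses, references)   # SacreBLEU
chrf = sacrebleu.corpus_chrf(hypotheses, references)   # chrF2 (beta=2 by default)
ter = sacrebleu.corpus_ter(hypotheses, references)     # TER

print(bleu.score, chrf.score, ter.score)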

@article{kimera2025_2505.02463,
  title={Data Augmentation With Back translation for Low Resource languages: A case of English and Luganda},
  author={Richard Kimera and DongNyeong Heo and Daniela N. Rim and Heeyoul Choi},
  journal={arXiv preprint arXiv:2505.02463},
  year={2025}
}