The Effect of Domain and Diacritics in Yorùbá-English Neural Machine Translation
arXiv:2103.08647, 15 March 2021
David Ifeoluwa Adelani, Dana Ruiter, Jesujoba Oluwadara Alabi, Damilola Adebonojo, Adesina Ayeni, Mofetoluwa Adeyemi, Ayodele Awokoya, C. España-Bonet

Papers citing "The Effect of Domain and Diacritics in Yorùbá-English Neural Machine Translation"

5 of 5 papers shown

Glot500: Scaling Multilingual Corpora and Language Models to 500 Languages
20 May 2023 (ALM, LRM)
Ayyoob Imani, Peiqin Lin, Amir Hossein Kargaran, Silvia Severini, Masoud Jalili Sabet, ..., Chunlan Ma, Helmut Schmid, André F. T. Martins, François Yvon, Hinrich Schütze

Fixing MoE Over-Fitting on Low-Resource Languages in Multilingual Machine Translation
15 Dec 2022 (MoE)
Maha Elbayad, Anna Y. Sun, Shruti Bhosale

Separating Grains from the Chaff: Using Data Filtering to Improve Multilingual Translation for Low-Resourced African Languages
19 Oct 2022
Idris Abdulmumin, Michael Beukman, Jesujoba Oluwadara Alabi, Chris C. Emezue, Everlyn Asiko, ..., Shamsuddeen Hassan Muhammad, Mofetoluwa Adeyemi, Oreen Yousuf, Sahib Singh, T. Gwadabe

MMTAfrica: Multilingual Machine Translation for African Languages
08 Apr 2022
Chris C. Emezue, Bonaventure F. P. Dossou

Pre-Trained Multilingual Sequence-to-Sequence Models: A Hope for Low-Resource Language Translation?
16 Mar 2022 (VLM)
E. Lee, Sarubi Thillainathan, Shravan Nayak, Surangika Ranathunga, David Ifeoluwa Adelani, Ruisi Su, Arya D. McCarthy