Are Character-level Translations Worth the Wait? Comparing ByT5 and mT5 for Machine Translation

28 February 2023
Lukas Edman, Gabriele Sarti, Antonio Toral, Gertjan van Noord, Arianna Bisazza
arXiv:2302.14220

Papers citing "Are Character-level Translations Worth the Wait? Comparing ByT5 and mT5 for Machine Translation"

7 citing papers
MoCE: Adaptive Mixture of Contextualization Experts for Byte-based Neural Machine Translation
Langlin Huang, Mengyu Bu, Yang Feng
3 November 2024
Can LLMs Really Learn to Translate a Low-Resource Language from One Grammar Book?
Seth Aycock, David Stap, Di Wu, Christof Monz, Khalil Sima'an
27 September 2024
"Will You Find These Shortcuts?" A Protocol for Evaluating the
  Faithfulness of Input Salience Methods for Text Classification
"Will You Find These Shortcuts?" A Protocol for Evaluating the Faithfulness of Input Salience Methods for Text Classification
Jasmijn Bastings
Sebastian Ebert
Polina Zablotskaia
Anders Sandholm
Katja Filippova
99
75
0
14 Nov 2021
Why don't people use character-level machine translation?
Jindřich Libovický, Helmut Schmid, Alexander M. Fraser
15 October 2021
Analyzing the Use of Character-Level Translation with Sparse and Noisy Datasets
Jörg Tiedemann, Preslav Nakov
27 September 2021
Word Alignment by Fine-tuning Embeddings on Parallel Corpora
Zi-Yi Dou, Graham Neubig
20 January 2021
CharacterBERT: Reconciling ELMo and BERT for Word-Level Open-Vocabulary Representations From Characters
Hicham El Boukkouri, Olivier Ferret, Thomas Lavergne, Hiroshi Noji, Pierre Zweigenbaum, Junichi Tsujii
20 October 2020