Extending Multilingual BERT to Low-Resource Languages (arXiv:2004.13640)

28 April 2020
Zihan Wang
Karthikeyan K
Stephen D. Mayhew
Dan Roth
    VLM

Papers citing "Extending Multilingual BERT to Low-Resource Languages"

26 / 26 papers shown
LUSIFER: Language Universal Space Integration for Enhanced Multilingual Embeddings with Large Language Models
Hieu Man, Nghia Trung Ngo, Viet Dac Lai, Ryan Rossi, Franck Dernoncourt, T. Nguyen
01 Jan 2025

Adapters for Altering LLM Vocabularies: What Languages Benefit the Most? (VLM)
HyoJung Han, Akiko Eriguchi, Haoran Xu, Hieu T. Hoang, Marine Carpuat, Huda Khayrallah
12 Oct 2024

UniBridge: A Unified Approach to Cross-Lingual Transfer Learning for Low-Resource Languages
Trinh Pham, Khoi M. Le, Luu Anh Tuan
14 Jun 2024

A Systematic Analysis of Subwords and Cross-Lingual Transfer in Multilingual Translation
Francois Meyer, Jan Buys
29 Mar 2024

Lexicon and Rule-based Word Lemmatization Approach for the Somali Language
Shafie Abdi Mohamed, Muhidin A. Mohamed
03 Aug 2023

Improving Cross-lingual Information Retrieval on Low-Resource Languages via Optimal Transport Distillation (VLM)
Zhiqi Huang, Puxuan Yu, James Allan
29 Jan 2023

Dissociating language and thought in large language models (ELM, ReLM)
Kyle Mahowald, Anna A. Ivanova, I. Blank, Nancy Kanwisher, J. Tenenbaum, Evelina Fedorenko
16 Jan 2023

BLOOM+1: Adding Language Support to BLOOM for Zero-Shot Prompting (CLL, VLM, AI4CE, LRM)
Zheng-Xin Yong, Hailey Schoelkopf, Niklas Muennighoff, Alham Fikri Aji, David Ifeoluwa Adelani, ..., Genta Indra Winata, Stella Biderman, Edward Raff, Dragomir R. Radev, Vassilina Nikoulina
19 Dec 2022

Extending the Subwording Model of Multilingual Pretrained Models for New Languages (VLM)
K. Imamura, Eiichiro Sumita
29 Nov 2022

COVID-19-related Nepali Tweets Classification in a Low Resource Setting
Rabin Adhikari, Safal Thapaliya, Nirajan Basnet, S. Poudel, Aman Shakya, Bishesh Khanal
11 Oct 2022

Language Modelling with Pixels (VLM)
Phillip Rust, Jonas F. Lotz, Emanuele Bugliarello, Elizabeth Salesky, Miryam de Lhoneux, Desmond Elliott
14 Jul 2022

Improving Low-Resource Speech Recognition with Pretrained Speech Models: Continued Pretraining vs. Semi-Supervised Training (VLM, AI4TS)
Mitchell DeHaven, J. Billa
01 Jul 2022

Lifting the Curse of Multilinguality by Pre-training Modular Transformers (LRM)
Jonas Pfeiffer, Naman Goyal, Xi Victoria Lin, Xian Li, James Cross, Sebastian Riedel, Mikel Artetxe
12 May 2022

Match the Script, Adapt if Multilingual: Analyzing the Effect of Multilingual Pretraining on Cross-lingual Transferability (AAML)
Yoshinari Fujinuma, Jordan L. Boyd-Graber, Katharina Kann
21 Mar 2022

Expanding Pretrained Models to Thousands More Languages via Lexicon-based Adaptation
Xinyi Wang, Sebastian Ruder, Graham Neubig
17 Mar 2022

Between words and characters: A Brief History of Open-Vocabulary Modeling and Tokenization in NLP
Sabrina J. Mielke, Zaid Alyafeai, Elizabeth Salesky, Colin Raffel, Manan Dey, ..., Arun Raja, Chenglei Si, Wilson Y. Lee, Benoît Sagot, Samson Tan
20 Dec 2021

Focusing on Potential Named Entities During Active Label Acquisition
Ali Osman Berk Şapcı, Oznur Tastan, Reyyan Yeniterzi
06 Nov 2021

IndoBERTweet: A Pretrained Language Model for Indonesian Twitter with Effective Domain-Specific Vocabulary Initialization (VLM)
Fajri Koto, Jey Han Lau, Timothy Baldwin
10 Sep 2021

Subword Mapping and Anchoring across Languages
Giorgos Vernikos, Andrei Popescu-Belis
09 Sep 2021

A Primer on Pretrained Multilingual Language Models (LRM)
Sumanth Doddapaneni, Gowtham Ramesh, Mitesh M. Khapra, Anoop Kunchukuttan, Pratyush Kumar
01 Jul 2021

Revisiting the Primacy of English in Zero-shot Cross-lingual Transfer
Iulia Turc, Kenton Lee, Jacob Eisenstein, Ming-Wei Chang, Kristina Toutanova
30 Jun 2021

Specializing Multilingual Language Models: An Empirical Study
Ethan C. Chau, Noah A. Smith
16 Jun 2021

CANINE: Pre-training an Efficient Tokenization-Free Encoder for Language Representation
J. Clark, Dan Garrette, Iulia Turc, John Wieting
11 Mar 2021

UNKs Everywhere: Adapting Multilingual Language Models to New Scripts
Jonas Pfeiffer, Ivan Vulić, Iryna Gurevych, Sebastian Ruder
31 Dec 2020

Orthogonal Language and Task Adapters in Zero-Shot Cross-Lingual Transfer
M. Vidoni, Ivan Vulić, Goran Glavas
11 Dec 2020

Word Translation Without Parallel Data
Alexis Conneau, Guillaume Lample, Marc'Aurelio Ranzato, Ludovic Denoyer, Hervé Jégou
11 Oct 2017