MC$^2$: Towards Transparent and Culturally-Aware NLP for Minority Languages in China (arXiv:2311.08348)
14 November 2023
Chen Zhang, Mingxu Tao, Quzhe Huang, Jiuheng Lin, Zhibin Chen, Yansong Feng
Papers citing "MC$^2$: Towards Transparent and Culturally-Aware NLP for Minority Languages in China" (9 papers)
1. Large Language Models in Numberland: A Quick Test of Their Numerical Reasoning Abilities. Roussel Rahman. 31 Mar 2025.
2. Quality Does Matter: A Detailed Look at the Quality and Utility of Web-Mined Parallel Corpora. Surangika Ranathunga, Nisansa de Silva, Menan Velayuthan, Aloka Fernando, Charitha Rathnayake. 12 Feb 2024.
3. WanJuan: A Comprehensive Multimodal Dataset for Advancing English and Chinese Large Models. Conghui He, Zhenjiang Jin, Chaoxi Xu, Jiantao Qiu, Bin Wang, Wei Li, Hang Yan, Jiaqi Wang, Da Lin. 21 Aug 2023.
4. DADA: Dialect Adaptation via Dynamic Aggregation of Linguistic Rules. Yanchen Liu, William B. Held, Diyi Yang. 22 May 2023.
5. A Survey of Code-switching: Linguistic and Social Perspectives for Language Technologies. A. Seza Doğruöz, Sunayana Sitaram, Barbara E. Bullock, Almeida Jacqueline Toribio. 05 Jan 2023.
6. Language Models are Multilingual Chain-of-Thought Reasoners. Freda Shi, Mirac Suzgun, Markus Freitag, Xuezhi Wang, Suraj Srivats, ..., Yi Tay, Sebastian Ruder, Denny Zhou, Dipanjan Das, Jason W. Wei. 06 Oct 2022.
7. Deduplicating Training Data Makes Language Models Better. Katherine Lee, Daphne Ippolito, A. Nystrom, Chiyuan Zhang, Douglas Eck, Chris Callison-Burch, Nicholas Carlini. 14 Jul 2021.
8. When Being Unseen from mBERT is just the Beginning: Handling New Languages With Multilingual Language Models. Benjamin Muller, Antonis Anastasopoulos, Benoît Sagot, Djamé Seddah. 24 Oct 2020.
9. Megatron-LM: Training Multi-Billion Parameter Language Models Using Model Parallelism. M. Shoeybi, M. Patwary, Raul Puri, P. LeGresley, Jared Casper, Bryan Catanzaro. 17 Sep 2019.