ResearchTrend.AI

Pre-Trained Language Models Represent Some Geographic Populations Better Than Others (arXiv:2403.11025)

16 March 2024
Jonathan Dunn, Benjamin Adams, Harish Tayyar Madabushi

Papers citing "Pre-Trained Language Models Represent Some Geographic Populations Better Than Others"

3 papers shown
Register Variation Remains Stable Across 60 Languages
Haipeng Li, Jonathan Dunn, A. Nini
20 Sep 2022
Measuring Geographic Performance Disparities of Offensive Language Classifiers
Brandon Lwowski, P. Rad, Anthony Rios
15 Sep 2022
Geographical Distance Is The New Hyperparameter: A Case Study Of Finding The Optimal Pre-trained Language For English-isiZulu Machine Translation
Muhammad Umair Nasir, Innocent Amos Mchechesi
17 May 2022