EMMA-500: Enhancing Massively Multilingual Adaptation of Large Language Models

26 September 2024
Shaoxiong Ji, Zihao Li, Indraneil Paul, Jaakko Paavola, Peiqin Lin, Pinzhen Chen, Dayyán O'Brien, Hengyu Luo, Hinrich Schütze, Jörg Tiedemann, Barry Haddow
CLL
ArXiv · PDF · HTML

Papers citing "EMMA-500: Enhancing Massively Multilingual Adaptation of Large Language Models"

1 / 1 papers shown

Rethinking Multilingual Continual Pretraining: Data Mixing for Adapting LLMs Across Languages and Resources
Zihao Li, Shaoxiong Ji, Hengyu Luo, Jörg Tiedemann
CLL · 05 Apr 2025