Goldfish: Monolingual Language Models for 350 Languages
19 August 2024
Authors: Tyler A. Chang, Catherine Arnett, Zhuowen Tu, Benjamin Bergen
Topics: LRM

Papers citing "Goldfish: Monolingual Language Models for 350 Languages" (5 of 5 papers shown)

1. Multi-granular Training Strategies for Robust Multi-hop Reasoning Over Noisy and Heterogeneous Knowledge Sources
   Jackson Coleman, Isaiah Lawrence, Benjamin Turner · LRM · 09 Feb 2025

2. Low-resource Machine Translation: what for? who for? An observational study on a dedicated Tetun language translation service
   Raphael Merx, Hanna Suominen, Adérito José Guterres Correia, Trevor Cohn · 19 Nov 2024

3. Deduplicating Training Data Makes Language Models Better
   Katherine Lee, Daphne Ippolito, A. Nystrom, Chiyuan Zhang, Douglas Eck, Chris Callison-Burch, Nicholas Carlini · SyDa · 14 Jul 2021

4. The Tatoeba Translation Challenge -- Realistic Data Sets for Low Resource and Multilingual MT
   Jörg Tiedemann · 13 Oct 2020

5. Scaling Laws for Neural Language Models
   Jared Kaplan, Sam McCandlish, T. Henighan, Tom B. Brown, B. Chess, R. Child, Scott Gray, Alec Radford, Jeff Wu, Dario Amodei · 23 Jan 2020