ResearchTrend.AI

GlossLM: Multilingual Pretraining for Low-Resource Interlinear Glossing

11 March 2024
Michael Ginn
Lindia Tjuatja
Taiqi He
Enora Rice
Graham Neubig
Alexis Palmer
Lori Levin
arXiv:2403.06399

Papers citing "GlossLM: Multilingual Pretraining for Low-Resource Interlinear Glossing" (2 papers shown)

1. Multiple Sources are Better Than One: Incorporating External Knowledge in Low-Resource Glossing. Changbing Yang, Garrett Nicolai, Miikka Silfverberg. 16 Jun 2024.
2. CMULAB: An Open-Source Framework for Training and Deployment of Natural Language Processing Models. Zaid A. W. Sheikh, Antonios Anastasopoulos, Shruti Rijhwani, Lindia Tjuatja, Robbie Jimerson, Graham Neubig. 03 Apr 2024.