ResearchTrend.AI

Got Compute, but No Data: Lessons From Post-training a Finnish LLM

12 March 2025
Elaine Zosa
Ville Komulainen
Sampo Pyysalo
Abstract

As LLMs gain more popularity as chatbots and general assistants, methods have been developed to enable LLMs to follow instructions and align with human preferences. These methods have found success in the field, but their effectiveness has not been demonstrated outside of high-resource languages. In this work, we discuss our experiences in post-training an LLM for instruction-following for English and Finnish. We use a multilingual LLM to translate instruction and preference datasets from English to Finnish. We perform instruction tuning and preference optimization in English and Finnish and evaluate the instruction-following capabilities of the model in both languages. Our results show that with a few hundred Finnish instruction samples we can obtain competitive performance in Finnish instruction-following. We also found that although preference optimization in English offers some cross-lingual benefits, we obtain our best results by using preference data from both languages. We release our model, datasets, and recipes under open licenses at this https URL
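One step the abstract describes is combining preference data from both languages before preference optimization. A minimal sketch of that data-preparation step is shown below; the record format (prompt/chosen/rejected pairs, as used in DPO-style training) and the pool-and-shuffle strategy are illustrative assumptions, not the authors' published recipe.

```python
import random

# A preference example pairs a prompt with a chosen (preferred) and a
# rejected response -- the format consumed by DPO-style preference
# optimization. The "lang" field is an assumed bookkeeping tag.
def make_pair(prompt, chosen, rejected, lang):
    return {"prompt": prompt, "chosen": chosen, "rejected": rejected, "lang": lang}

def mix_preference_data(en_pairs, fi_pairs, seed=0):
    """Pool English and Finnish preference pairs and shuffle them.

    The abstract reports the best results when preference data from both
    languages is used; the exact mixing ratio and strategy in the paper
    are not specified here, so this simply pools and shuffles.
    """
    mixed = list(en_pairs) + list(fi_pairs)
    random.Random(seed).shuffle(mixed)
    return mixed

# Toy usage: three English pairs and two Finnish pairs.
en = [make_pair(f"q{i}", "good answer", "bad answer", "en") for i in range(3)]
fi = [make_pair(f"k{i}", "hyvä vastaus", "huono vastaus", "fi") for i in range(2)]
mixed = mix_preference_data(en, fi)
```

In a real pipeline, the Finnish pairs would come from machine-translating the English preference dataset with a multilingual LLM, as the abstract describes, and the mixed set would then be fed to a preference-optimization trainer.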

@article{zosa2025_2503.09407,
  title={Got Compute, but No Data: Lessons From Post-training a Finnish LLM},
  author={Elaine Zosa and Ville Komulainen and Sampo Pyysalo},
  journal={arXiv preprint arXiv:2503.09407},
  year={2025}
}