Kakugo: Distillation of Low-Resource Languages into Small Language Models
Peter Devine, Mardhiyah Sanni, Farid Adilazuarda, Julieta Gil Loizaga, Barry Haddow (SyDaVLM)
Main: 8 pages · Bibliography: 5 pages · Appendix: 3 pages · 2 figures · 6 tables
Abstract
We present Kakugo, a novel, cost-effective pipeline for training general-purpose Small Language Models (SLMs) for low-resource languages using only the language name as input. By using a large teacher model to generate synthetic prompts and translate instruction datasets, we produced training data and SLMs for 54 low-resource languages. Evaluations across a diverse set of general natural language processing tasks, including translation, classification, and question answering, demonstrate that our pipeline consistently improves performance over the base models. With a total generation and training cost of under $50 per language, Kakugo offers an accessible method for communities to develop language-specific AI.
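To make the abstract's pipeline concrete, the sketch below illustrates the kind of teacher-driven data generation it describes: given only a language name, a large teacher model both invents synthetic instruction prompts and translates an existing instruction dataset. This is a minimal sketch, not the authors' implementation; the choice of an OpenAI-compatible chat API as the teacher, the model name, the prompt wording, and all helper functions are assumptions for illustration.

```python
# Illustrative sketch of teacher-driven data generation for one low-resource
# language. NOT the Kakugo codebase: the teacher backend, model name, prompts,
# and helpers are hypothetical placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
TEACHER_MODEL = "gpt-4o"  # hypothetical teacher model; the paper's choice may differ


def ask_teacher(prompt: str) -> str:
    """Send a single-turn request to the teacher model and return its reply."""
    response = client.chat.completions.create(
        model=TEACHER_MODEL,
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content


def generate_synthetic_prompts(language: str, n: int) -> list[str]:
    """Ask the teacher to invent instruction prompts in the target language."""
    return [
        ask_teacher(
            f"Write one diverse, self-contained instruction in {language}. "
            "Reply with the instruction only."
        )
        for _ in range(n)
    ]


def translate_example(example: dict, language: str) -> dict:
    """Translate an existing instruction/response pair into the target language."""
    return {
        "instruction": ask_teacher(f"Translate into {language}:\n{example['instruction']}"),
        "response": ask_teacher(f"Translate into {language}:\n{example['response']}"),
    }


if __name__ == "__main__":
    # Build a tiny SLM training set for one language, given only its name.
    language = "Tigrinya"  # example target language, chosen arbitrarily

    # 1) Synthetic prompts, answered by the teacher itself (distillation).
    data = [
        {"instruction": p, "response": ask_teacher(p)}
        for p in generate_synthetic_prompts(language, 5)
    ]

    # 2) Translation of an existing (here, toy) instruction dataset.
    seed_dataset = [
        {"instruction": "List three uses of a hammer.",
         "response": "Driving nails, removing nails, and shaping metal."},
    ]
    data += [translate_example(ex, language) for ex in seed_dataset]

    print(f"Collected {len(data)} training examples for {language}.")
```

The resulting instruction/response pairs would then be used for supervised fine-tuning of a small base model; the paper reports that generation plus training costs under $50 per language, which is consistent with a pipeline of this general shape using modest data volumes.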
