
Trans-Zero: Self-Play Incentivizes Large Language Models for Multilingual Translation Without Parallel Data

Abstract

The rise of Large Language Models (LLMs) has reshaped machine translation (MT), but multilingual MT still relies heavily on parallel data for supervised fine-tuning (SFT), facing challenges such as data scarcity for low-resource languages and catastrophic forgetting. To address these issues, we propose TRANS-ZERO, a self-play framework that leverages only monolingual data and the intrinsic multilingual knowledge of LLMs. TRANS-ZERO combines Genetic Monte-Carlo Tree Search (G-MCTS) with preference optimization, achieving strong translation performance that rivals supervised methods. Experiments demonstrate that this approach not only matches the performance of models trained on large-scale parallel data but also excels in non-English translation directions. Further analysis reveals that G-MCTS itself significantly enhances translation quality by exploring semantically consistent candidates through iterative translations, providing a robust foundation for the framework's success.
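To make the self-play idea concrete, the sketch below illustrates one plausible reading of the loop described in the abstract: starting from monolingual text only, generate translation candidates via iterative (multi-hop) translation, score them by semantic consistency with the source through back-translation, and keep best-vs-worst pairs for preference optimization. This is not the authors' implementation; the `translate` and `similarity` functions are placeholders, and a random multi-hop search stands in for the paper's Genetic MCTS.

```python
# Hedged sketch of a TRANS-ZERO-style self-play loop (illustrative only).
# Assumptions: `translate` wraps an LLM translation call, `similarity` is a
# crude stand-in for a semantic-consistency score, and the random multi-hop
# exploration is a simplification of the paper's Genetic MCTS (G-MCTS).
import random
from difflib import SequenceMatcher


def translate(text: str, src: str, tgt: str) -> str:
    """Placeholder for an LLM translation call (hypothetical)."""
    return f"[{tgt}] {text}"


def similarity(a: str, b: str) -> float:
    """Crude stand-in for a semantic-consistency score."""
    return SequenceMatcher(None, a, b).ratio()


def explore_candidates(source: str, src_lang: str, tgt_lang: str,
                       pivots: list[str], n_paths: int = 4) -> list[tuple[str, float]]:
    """Sample multi-hop translation paths (simplified stand-in for G-MCTS)
    and score each candidate by round-trip consistency with the source."""
    scored = []
    for _ in range(n_paths):
        path = random.sample(pivots, k=min(2, len(pivots)))
        text, cur = source, src_lang
        for hop in path + [tgt_lang]:  # iterative translation through pivot languages
            text, cur = translate(text, cur, hop), hop
        back = translate(text, tgt_lang, src_lang)  # back-translate to measure consistency
        scored.append((text, similarity(source, back)))
    return sorted(scored, key=lambda x: x[1], reverse=True)


def build_preference_pairs(monolingual: list[str], src_lang: str, tgt_lang: str,
                           pivots: list[str]) -> list[dict]:
    """Turn the best and worst candidates into (chosen, rejected) pairs that a
    preference-optimization step (e.g. DPO-style training, not shown) could consume."""
    pairs = []
    for sent in monolingual:
        cands = explore_candidates(sent, src_lang, tgt_lang, pivots)
        if len(cands) >= 2:
            pairs.append({"prompt": sent, "chosen": cands[0][0], "rejected": cands[-1][0]})
    return pairs


if __name__ == "__main__":
    data = ["Ein Beispielsatz ohne Parallelkorpus."]
    for pair in build_preference_pairs(data, "de", "zh", pivots=["en", "fr"]):
        print(pair)
```

Under these assumptions, the only supervision signal is the model's own consistency across translation hops, which matches the abstract's claim that no parallel data is required.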

@article{zou2025_2504.14669,
  title={Trans-Zero: Self-Play Incentivizes Large Language Models for Multilingual Translation Without Parallel Data},
  author={Wei Zou and Sen Yang and Yu Bao and Shujian Huang and Jiajun Chen and Shanbo Cheng},
  journal={arXiv preprint arXiv:2504.14669},
  year={2025}
}