TagRouter: Learning Route to LLMs through Tags for Open-Domain Text Generation Tasks

Annual Meeting of the Association for Computational Linguistics (ACL), 2025
Main: 11 pages; Appendix: 14 pages; Bibliography: 1 page; 14 figures; 15 tables
Abstract

Model routing allocates each query to the most suitable model, improving system performance while reducing costs. However, existing routing methods face practical limitations that hinder scalability in large-scale applications and struggle to keep pace with the rapid growth of the large language model (LLM) ecosystem. To tackle these challenges, we propose TagRouter, a training-free model routing method designed to optimize the synergy among multiple LLMs for open-domain text generation tasks. Experimental results demonstrate that TagRouter outperforms 13 baseline methods, increasing the system's accept rate by 6.15% and reducing costs by 17.20%, achieving optimal cost-efficiency. Our findings provide the LLM community with an efficient and scalable solution for model ensembling, offering users an evolvable "super model."
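The abstract describes tag-based routing only at a high level. The sketch below illustrates the general idea of routing a query to the cheapest model whose capability tags cover the query's tags; it is a hypothetical illustration, not the paper's actual TagRouter implementation, and all names (Candidate, CANDIDATES, extract_tags, route) and tag assignments are made up for this example.

# Hypothetical sketch of tag-based model routing (illustrative only; not the
# paper's TagRouter). Each candidate LLM carries a set of capability tags;
# a query is tagged, then routed to the cheapest model whose tags cover it.

from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    tags: set[str]              # capabilities this model is assumed to handle
    cost_per_1k_tokens: float   # assumed relative serving cost

# Assumed candidate pool and tag assignments (made up for illustration).
CANDIDATES = [
    Candidate("small-llm", {"summarization", "chitchat"}, 0.1),
    Candidate("mid-llm", {"summarization", "chitchat", "translation"}, 0.5),
    Candidate("large-llm",
              {"summarization", "chitchat", "translation", "creative-writing"},
              2.0),
]

def extract_tags(query: str) -> set[str]:
    """Toy keyword-based tagger; a real system would use a learned or
    LLM-based tag generator."""
    keywords = {
        "summarize": "summarization",
        "translate": "translation",
        "story": "creative-writing",
    }
    tags = {tag for kw, tag in keywords.items() if kw in query.lower()}
    return tags or {"chitchat"}

def route(query: str) -> Candidate:
    """Pick the cheapest candidate whose tags cover the query's tags;
    fall back to the most capable (most expensive) model otherwise."""
    query_tags = extract_tags(query)
    covering = [c for c in CANDIDATES if query_tags <= c.tags]
    if covering:
        return min(covering, key=lambda c: c.cost_per_1k_tokens)
    return max(CANDIDATES, key=lambda c: c.cost_per_1k_tokens)

if __name__ == "__main__":
    for q in ["Summarize this article.", "Write a short story about a robot."]:
        print(q, "->", route(q).name)

Because routing is decided purely from tags at inference time, the candidate pool in a scheme like this can grow or shrink without retraining, which is the scalability property the abstract emphasizes.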
