SOTAVerified

TagRouter: Learning Route to LLMs through Tags for Open-Domain Text Generation Tasks

2025-06-14 · Code Available

Zhou Chen, Zhiqiang Wei, Yuqi Bai, Xue Xiong, Jianmin Wu

Abstract

Model routing allocates each query to a suitable model, improving system performance while reducing cost. However, existing routing methods face practical limitations that hinder scalability in large-scale applications and struggle to keep pace with the rapid growth of the large language model (LLM) ecosystem. To address these challenges, we propose TagRouter, a training-free model routing method designed to optimize the synergy among multiple LLMs for open-domain text generation tasks. Experimental results demonstrate that TagRouter outperforms 13 baseline methods, increasing the system's acceptance rate by 6.15% and reducing cost by 17.20%, achieving optimal cost-efficiency. Our findings provide the LLM community with an efficient and scalable solution for model ensembling, offering users an evolvable "super model."
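The abstract does not spell out the routing mechanism, but the core idea of tag-based routing can be sketched as follows. This is a minimal illustration, not the paper's implementation: the model names, capability tags, and cost figures are all hypothetical, and a real router would derive query tags from the query text rather than receive them directly.

```python
# Hypothetical sketch of tag-based model routing (not TagRouter's actual algorithm).
# Each model in the pool advertises a set of capability tags and a relative cost;
# the router sends a tagged query to the cheapest model whose tags cover it.

MODEL_POOL = {
    # model name: (capability tags, relative cost per query) -- illustrative values
    "small-llm": ({"summarization", "translation"}, 1.0),
    "mid-llm":   ({"summarization", "translation", "creative-writing"}, 3.0),
    "large-llm": ({"summarization", "translation", "creative-writing", "code"}, 10.0),
}

def route(query_tags: set) -> str:
    """Return the cheapest model whose tag set covers the query's tags."""
    candidates = [
        (cost, name)
        for name, (tags, cost) in MODEL_POOL.items()
        if query_tags <= tags  # model's capabilities must cover the query
    ]
    if not candidates:
        # No model covers every tag: fall back to the most capable (costliest) one.
        return max(MODEL_POOL, key=lambda n: MODEL_POOL[n][1])
    return min(candidates)[1]

print(route({"translation"}))       # cheapest sufficient model: small-llm
print(route({"creative-writing"}))  # small-llm lacks the tag, so: mid-llm
```

Routing by cheapest-sufficient-model is what yields the cost savings the abstract reports: easy queries never reach the expensive model, while hard ones still do.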
