Self-Routing RAG: Binding Selective Retrieval with Knowledge Verbalization

1 April 2025
Di Wu, Jia-Chen Gu, Kai-Wei Chang, Nanyun Peng
Abstract

Selective retrieval improves retrieval-augmented generation (RAG) by reducing distractions from low-quality retrievals and improving efficiency. However, existing approaches under-utilize the inherent knowledge of large language models (LLMs), leading to suboptimal retrieval decisions and degraded generation performance. To bridge this gap, we propose Self-Routing RAG (SR-RAG), a novel framework that binds selective retrieval with knowledge verbalization. SR-RAG enables an LLM to dynamically decide between external retrieval and verbalizing its own parametric knowledge. To this end, we design a multi-task objective that jointly optimizes an LLM on knowledge source selection, knowledge verbalization, and response generation. We further introduce dynamic knowledge source inference via nearest neighbor search to improve the accuracy of knowledge source decisions under domain shifts. Fine-tuning three LLMs with SR-RAG significantly improves response accuracy and reduces inference latency. Compared to the strongest selective retrieval baseline, SR-RAG reduces the number of retrievals by 29% while improving performance by 5.1%.
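
The nearest-neighbor routing step lends itself to a compact illustration. The sketch below is a minimal, illustrative rendering of dynamic knowledge source inference, not the authors' released code: it assumes cosine similarity over hidden states, a simple majority vote among k neighbors, and random vectors standing in for LLM representations; all function and variable names are hypothetical.

import numpy as np

# Minimal sketch of nearest-neighbor knowledge source inference
# (illustrative assumptions: cosine similarity, majority vote, k=8).
def route_knowledge_source(query_hidden, memory_hiddens, memory_labels, k=8):
    """Pick 'retrieve' or 'verbalize' by a kNN vote over hidden states."""
    # Normalize so the dot product equals cosine similarity.
    q = query_hidden / np.linalg.norm(query_hidden)
    m = memory_hiddens / np.linalg.norm(memory_hiddens, axis=1, keepdims=True)
    sims = m @ q                      # similarity to each stored example
    top_k = np.argsort(-sims)[:k]     # indices of the k nearest neighbors
    votes = sum(1 for i in top_k if memory_labels[i] == "retrieve")
    return "retrieve" if votes > k / 2 else "verbalize"

# Toy demo: random hidden states stand in for LLM representations, and
# each stored example carries a label for its preferred knowledge source.
rng = np.random.default_rng(0)
memory = rng.normal(size=(100, 64))
labels = rng.choice(["retrieve", "verbalize"], size=100)
query = rng.normal(size=64)
print(route_knowledge_source(query, memory, labels))

Because the neighbor memory can be rebuilt from newly labeled examples without touching model weights, this style of routing can plausibly track domain shift, which matches the abstract's stated motivation for the nearest-neighbor component.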

@article{wu2025_2504.01018,
  title={Self-Routing RAG: Binding Selective Retrieval with Knowledge Verbalization},
  author={Di Wu and Jia-Chen Gu and Kai-Wei Chang and Nanyun Peng},
  journal={arXiv preprint arXiv:2504.01018},
  year={2025}
}