DRAG: Distilling RAG for SLMs from LLMs to Transfer Knowledge and Mitigate Hallucination via Evidence and Graph-based Distillation
Annual Meeting of the Association for Computational Linguistics (ACL), 2025
arXiv: 2506.01954 · 2 June 2025
Authors: Jennifer Chen, Aidar Myrzakhan, Yaxin Luo, Hassaan Muhammad Khan, Sondos Mahmoud Bsharat, Zhiqiang Shen
Papers citing "DRAG: Distilling RAG for SLMs from LLMs to Transfer Knowledge and Mitigate Hallucination via Evidence and Graph-based Distillation" (2 papers)
A Survey on Collaborating Small and Large Language Models for Performance, Cost-effectiveness, Cloud-edge Privacy, and Trustworthiness
Fali Wang, Jihai Chen, Shuhua Yang, Ali Al-Lawati, Linli Tang, Hui Liu, Suhang Wang
14 Oct 2025
HetaRAG: Hybrid Deep Retrieval-Augmented Generation across Heterogeneous Data Stores
Guohang Yan, Y. Zhang, Pinlong Cai, Botian Shi, Song Mao, Hongwei Zhang, Yaoze Zhang, Hairong Zhang, Xinyu Cai, Ding Wang
12 Sep 2025