
arXiv: 2506.01954
DRAG: Distilling RAG for SLMs from LLMs to Transfer Knowledge and Mitigate Hallucination via Evidence and Graph-based Distillation


Annual Meeting of the Association for Computational Linguistics (ACL), 2025
2 June 2025
Jennifer Chen
Aidar Myrzakhan
Yaxin Luo
Hassaan Muhammad Khan
Sondos Mahmoud Bsharat
Zhiqiang Shen

Papers citing "DRAG: Distilling RAG for SLMs from LLMs to Transfer Knowledge and Mitigate Hallucination via Evidence and Graph-based Distillation"

2 / 2 papers shown
A Survey on Collaborating Small and Large Language Models for Performance, Cost-effectiveness, Cloud-edge Privacy, and Trustworthiness
Fali Wang
Jihai Chen
Shuhua Yang
Ali Al-Lawati
Linli Tang
Hui Liu
Suhang Wang
14 Oct 2025
HetaRAG: Hybrid Deep Retrieval-Augmented Generation across Heterogeneous Data Stores
Guohang Yan
Y. Zhang
Pinlong Cai
Botian Shi
Song Mao
Hongwei Zhang
Yaoze Zhang
Hairong Zhang
Xinyu Cai
Ding Wang
12 Sep 2025