Unveiling the Magic: Investigating Attention Distillation in Retrieval-augmented Generation
arXiv:2402.11794 · 19 February 2024
Zizhong Li, Haopeng Zhang, Jiawei Zhang
RALM
Papers citing "Unveiling the Magic: Investigating Attention Distillation in Retrieval-augmented Generation" (4 papers)
How Language Model Hallucinations Can Snowball
Muru Zhang, Ofir Press, William Merrill, Alisa Liu, Noah A. Smith
HILM, LRM · 22 May 2023
Generate rather than Retrieve: Large Language Models are Strong Context Generators
Wenhao Yu, Dan Iter, Shuohang Wang, Yichong Xu, Mingxuan Ju, Soumya Sanyal, Chenguang Zhu, Michael Zeng, Meng Jiang
RALM, AIMat · 21 Sep 2022
Training language models to follow instructions with human feedback
Long Ouyang, Jeff Wu, Xu Jiang, Diogo Almeida, Carroll L. Wainwright, ..., Amanda Askell, Peter Welinder, Paul Christiano, Jan Leike, Ryan J. Lowe
OSLM, ALM · 04 Mar 2022
Distilling Knowledge from Reader to Retriever for Question Answering
Gautier Izacard, Edouard Grave
RALM · 08 Dec 2020