arXiv:2404.17287
When to Trust LLMs: Aligning Confidence with Response Quality
26 April 2024
Shuchang Tao, Liuyi Yao, Hanxing Ding, Yuexiang Xie, Qi Cao, Fei Sun, Jinyang Gao, Huawei Shen, Bolin Ding
Papers citing
"When to Trust LLMs: Aligning Confidence with Response Quality"
3 papers
Enhancing LLM Reliability via Explicit Knowledge Boundary Modeling
Hang Zheng, Hongshen Xu, Yuncong Liu, Lu Chen, Pascale Fung, Kai Yu
04 Mar 2025
Self-RAG: Learning to Retrieve, Generate, and Critique through Self-Reflection
Akari Asai, Zeqiu Wu, Yizhong Wang, Avirup Sil, Hannaneh Hajishirzi
17 Oct 2023
Fine-Tuning Language Models from Human Preferences
Daniel M. Ziegler, Nisan Stiennon, Jeff Wu, Tom B. Brown, Alec Radford, Dario Amodei, Paul Christiano, G. Irving
18 Sep 2019