Mitigating Hallucinations in Large Vision-Language Models via Summary-Guided Decoding
arXiv 2410.13321 · 20 February 2025
Kyungmin Min, Minbeom Kim, Kang-il Lee, Dongryeol Lee, Kyomin Jung
Topic: MLLM
Papers citing "Mitigating Hallucinations in Large Vision-Language Models via Summary-Guided Decoding" (3 of 3 shown)
Are LLM-Judges Robust to Expressions of Uncertainty? Investigating the effect of Epistemic Markers on LLM-based Evaluation
Dongryeol Lee, Yerin Hwang, Yongil Kim, Joonsuk Park, Kyomin Jung
Topic: ELM · 28 Oct 2024
Hallucination of Multimodal Large Language Models: A Survey
Zechen Bai, Pichao Wang, Tianjun Xiao, Tong He, Zongbo Han, Zheng Zhang, Mike Zheng Shou
Topics: VLM, LRM · 29 Apr 2024
AdvisorQA: Towards Helpful and Harmless Advice-seeking Question Answering with Collective Intelligence
Minbeom Kim, Hwanhee Lee, Joonsuk Park, Hwaran Lee, Kyomin Jung
18 Apr 2024