Quantity Matters: Towards Assessing and Mitigating Number Hallucination in Large Vision-Language Models

3 March 2024
Huixuan Zhang, Junzhe Zhang, Xiaojun Wan
MLLM

Papers citing "Quantity Matters: Towards Assessing and Mitigating Number Hallucination in Large Vision-Language Models"

Mitigating Object Hallucinations in Large Vision-Language Models through Visual Contrastive Decoding
Sicong Leng, Hang Zhang, Guanzheng Chen, Xin Li, Shijian Lu, Chunyan Miao, Li Bing
VLM, MLLM
28 Nov 2023