ResearchTrend.AI

© 2025 ResearchTrend.AI, All rights reserved.

arXiv:2410.15778
Reducing Hallucinations in Vision-Language Models via Latent Space Steering

21 October 2024
Sheng Liu
Haotian Ye
Lei Xing
James Zou
Topics: VLM, LLMSV

Papers citing "Reducing Hallucinations in Vision-Language Models via Latent Space Steering"

2 papers shown
Treble Counterfactual VLMs: A Causal Approach to Hallucination
Li Li
Jiashu Qu
Yuxiao Zhou
Yuehan Qin
Tiankai Yang
Yue Zhao
08 Mar 2025
Nullu: Mitigating Object Hallucinations in Large Vision-Language Models via HalluSpace Projection
Le Yang
Ziwei Zheng
Boxu Chen
Zhengyu Zhao
Chenhao Lin
Chao Shen
Topics: VLM
18 Dec 2024