Are Hallucinations Bad Estimations?

25 September 2025
Hude Liu
Jerry Yao-Chieh Hu
Jennifer Yuntong Zhang
Zhao Song
Han Liu
Community: HILM
Links: arXiv (abs) · PDF · HTML · GitHub
Main text: 31 pages · 10 figures · 1 table · Bibliography: 2 pages
Abstract

We formalize hallucinations in generative models as failures to link an estimate to any plausible cause. Under this interpretation, we show that even loss-minimizing optimal estimators still hallucinate, and we confirm this with a general high-probability lower bound on the hallucination rate for generic data distributions. This reframes hallucination as a structural misalignment between loss minimization and human-acceptable outputs, and hence as an estimation error induced by miscalibration. Experiments on coin aggregation, open-ended QA, and text-to-image generation support our theory.
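The abstract's coin-aggregation experiment suggests a simple way to see the claim that even a loss-minimizing estimator can hallucinate. The sketch below is a hypothetical illustration, not the paper's actual construction: under squared loss, the optimal point estimate of a fair coin flip is the empirical mean, roughly 0.5, which matches no outcome a coin can actually produce, that is, it links to no plausible cause in the sense defined above.

```python
import numpy as np

# Hypothetical coin-aggregation sketch (not the paper's construction):
# under squared loss, the optimal point estimate of a fair coin's outcome
# is the empirical mean, which is not itself a possible outcome.
rng = np.random.default_rng(0)
flips = rng.integers(0, 2, size=10_000)        # plausible outcomes live in {0, 1}
candidates = np.linspace(0.0, 1.0, 201)        # candidate point estimates
losses = [(np.mean((flips - c) ** 2), c) for c in candidates]
best_loss, best_estimate = min(losses)

print(f"loss-minimizing estimate: {best_estimate:.2f} (MSE = {best_loss:.3f})")
print("estimate matches a plausible outcome:", best_estimate in (0.0, 1.0))
# The squared-loss minimizer (~0.5) corresponds to no outcome a coin can
# produce: under the abstract's definition, it fails to link to any
# plausible cause even though it is optimal for this loss.
```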
