ResearchTrend.AI
arXiv:2311.13878

Minimizing Factual Inconsistency and Hallucination in Large Language Models

23 November 2023
Muneeswaran Irulandi
Shreya Saxena
Siva Prasad
M. V. Sai Prakash
Advaith Shankar
V. Varun
Vishal Vaddina
Saisubramaniam Gopalakrishnan
    HILM

Papers citing "Minimizing Factual Inconsistency and Hallucination in Large Language Models"

2 / 2 papers shown
1. Factual Error Correction for Abstractive Summarization Models
   Mengyao Cao, Yue Dong, Jiapeng Wu, Jackie C.K. Cheung
   HILM, KELM
   167 / 139 / 0
   17 Oct 2020

2. PubMedQA: A Dataset for Biomedical Research Question Answering
   Qiao Jin, Bhuwan Dhingra, Zhengping Liu, William W. Cohen, Xinghua Lu
   196 / 791 / 0
   13 Sep 2019