ResearchTrend.AI

CodeMirage: Hallucinations in Code Generated by Large Language Models

14 August 2024
Vibhor Agarwal, Yulong Pei, Salwa Alamir, Xiaomo Liu
arXiv: 2408.08333

Papers citing "CodeMirage: Hallucinations in Code Generated by Large Language Models"

2 / 2 papers shown
Title: Training language models to follow instructions with human feedback
Authors: Long Ouyang, Jeff Wu, Xu Jiang, Diogo Almeida, Carroll L. Wainwright, ..., Amanda Askell, Peter Welinder, Paul Christiano, Jan Leike, Ryan J. Lowe
Tags: OSLM, ALM
Date: 04 Mar 2022

Title: The Factual Inconsistency Problem in Abstractive Text Summarization: A Survey
Authors: Yi-Chong Huang, Xiachong Feng, Xiaocheng Feng, Bing Qin
Tags: HILM
Date: 30 Apr 2021