On Mitigating Code LLM Hallucinations with API Documentation
Nihal Jain, Robert Kwiatkowski, Baishakhi Ray, M. K. Ramanathan, Varun Kumar
arXiv:2407.09726 · 13 July 2024
Papers citing "On Mitigating Code LLM Hallucinations with API Documentation" (7 of 7 papers shown)

Hallucination by Code Generation LLMs: Taxonomy, Benchmarks, Mitigation, and Challenges
Yunseo Lee, John Youngeun Song, Dongsun Kim, Jindae Kim, Mijung Kim, Jaechang Nam · HILM, LRM · 29 Apr 2025

To Believe or Not to Believe Your LLM
Yasin Abbasi-Yadkori, Ilja Kuzborskij, András György, Csaba Szepesvári · UQCV · 04 Jun 2024

Self-RAG: Learning to Retrieve, Generate, and Critique through Self-Reflection
Akari Asai, Zeqiu Wu, Yizhong Wang, Avirup Sil, Hannaneh Hajishirzi · RALM · 17 Oct 2023

How Language Model Hallucinations Can Snowball
Muru Zhang, Ofir Press, William Merrill, Alisa Liu, Noah A. Smith · HILM, LRM · 22 May 2023

Lift Yourself Up: Retrieval-augmented Text Generation with Self Memory
Xin Cheng, Di Luo, Xiuying Chen, Lemao Liu, Dongyan Zhao, Rui Yan · RALM · 03 May 2023

Grounded Copilot: How Programmers Interact with Code-Generating Models
Shraddha Barke, M. James, Nadia Polikarpova · 30 Jun 2022

Scaling Laws for Neural Language Models
Jared Kaplan, Sam McCandlish, T. Henighan, Tom B. Brown, B. Chess, R. Child, Scott Gray, Alec Radford, Jeff Wu, Dario Amodei · 23 Jan 2020