Demo-Craft: Using In-Context Learning to Improve Code Generation in Large Language Models

30 October 2024
Nirmal Joshua Kapu
Mihit Sreejith
ArXiv · PDF · HTML
Abstract

Generating executable code from natural language instructions using Large Language Models (LLMs) poses challenges such as semantic ambiguity and understanding task-specific contexts. To address these issues, we propose a system called DemoCraft, which enhances code generation by leveraging in-context learning and demonstration selection, combined with latent concept learning. Latent concept learning introduces additional concept tokens, which are trainable embeddings that capture task-specific knowledge. We then test our system on two major datasets: MBPP and HumanEval. Our experimental results demonstrate that the proposed system achieves an approximately 2x increase in the pass@k metric compared to baseline models. Furthermore, we introduce two novel evaluation metrics: correctness@k and similarity@k. Our empirical studies indicate that our system attains nearly a 3x improvement in these metrics as well.
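The abstract does not spell out how the trained concept tokens drive demonstration selection, so the following is only a minimal sketch of one plausible scoring rule: pool the learned concept embeddings into a task vector and rank candidate demonstrations by cosine similarity to that vector and to the query. The function and variable names (select_demonstrations, demo_embs, concept_embs) are hypothetical and not identifiers from the paper.

```python
import torch
import torch.nn.functional as F

def select_demonstrations(query_emb: torch.Tensor,    # (d,) embedding of the task prompt
                          demo_embs: torch.Tensor,    # (num_demos, d) candidate demonstration embeddings
                          concept_embs: torch.Tensor, # (num_concepts, d) trained concept tokens
                          k: int = 4) -> list[int]:
    """Return indices of the k demonstrations to place in the in-context prompt.
    Hypothetical rule: score each candidate by similarity to the pooled
    concept tokens (task knowledge) and to the query itself."""
    task_vec = concept_embs.mean(dim=0, keepdim=True)              # (1, d) pooled task vector
    task_sim = F.cosine_similarity(demo_embs, task_vec, dim=-1)    # alignment with task concepts
    query_sim = F.cosine_similarity(demo_embs, query_emb.unsqueeze(0), dim=-1)
    scores = task_sim + query_sim                                  # (num_demos,)
    return scores.topk(min(k, demo_embs.shape[0])).indices.tolist()
```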

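For context on the reported pass@k numbers: the standard unbiased estimator (introduced alongside HumanEval) takes n generated samples per problem, of which c pass the unit tests, and estimates the probability that at least one of k drawn samples is correct, i.e. 1 - C(n-c, k)/C(n, k). The snippet below implements that standard estimator; whether DemoCraft's evaluation uses exactly this formulation is an assumption.

```python
import math

def pass_at_k(n: int, c: int, k: int) -> float:
    """Unbiased pass@k = 1 - C(n - c, k) / C(n, k): the probability that at
    least one of k samples drawn without replacement from n generations,
    c of which are correct, passes the tests."""
    if n - c < k:
        return 1.0  # every size-k draw must contain at least one correct sample
    return 1.0 - math.prod((n - c - i) / (n - i) for i in range(k))

# Example usage (illustrative numbers only):
# pass_at_k(n=20, c=3, k=5)
```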
@article{kapu2025_2411.00865,
  title={Demo-Craft: Using In-Context Learning to Improve Code Generation in Large Language Models},
  author={Nirmal Joshua Kapu and Mihit Sreejith},
  journal={arXiv preprint arXiv:2411.00865},
  year={2025}
}