Learning to Search Effective Example Sequences for In-Context Learning

11 March 2025
Xiang Gao, Ankita Sinha, Kamalika Das
Abstract

Large language models (LLMs) demonstrate impressive few-shot learning capabilities, but their performance varies widely based on the sequence of in-context examples. Key factors influencing this include the sequence's length, composition, and arrangement, as well as its relation to the specific query. Existing methods often tackle these factors in isolation, overlooking their interdependencies. Moreover, the extensive search space for selecting optimal sequences complicates the development of a holistic approach. In this work, we introduce Beam Search-based Example Sequence Constructor (BESC), a novel method for learning to construct optimal example sequences. BESC addresses all key factors involved in sequence selection by considering them jointly during inference, while incrementally building the sequence. This design enables the use of beam search to significantly reduce the complexity of the search space. Experiments across various datasets and language models show notable improvements in performance.
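
As a rough illustration of how such a constructor might operate, the sketch below implements a generic beam search over a pool of candidate demonstrations. It is not the paper's implementation: the learned sequence scorer is abstracted as a score_sequence(query, sequence) callable, and the function name, signature, and hyperparameters are assumptions made for this example.

# Hypothetical sketch: beam search over in-context example sequences.
# `score_sequence` stands in for a learned scorer that jointly rates a
# (query, partial example sequence) pair; it is not taken from the paper.
from typing import Callable, List, Sequence, Tuple

def beam_search_example_sequence(
    query: str,
    candidate_pool: Sequence[str],
    score_sequence: Callable[[str, Tuple[str, ...]], float],
    beam_width: int = 4,
    max_examples: int = 8,
) -> Tuple[str, ...]:
    """Incrementally build an example sequence for `query` with beam search."""
    # Each beam entry is (partial sequence, score); start from the empty sequence.
    beam: List[Tuple[Tuple[str, ...], float]] = [((), float("-inf"))]
    best: Tuple[Tuple[str, ...], float] = ((), float("-inf"))

    for _ in range(max_examples):
        expansions: List[Tuple[Tuple[str, ...], float]] = []
        for seq, _ in beam:
            for example in candidate_pool:
                if example in seq:
                    continue  # use each candidate example at most once
                new_seq = seq + (example,)
                expansions.append((new_seq, score_sequence(query, new_seq)))
        if not expansions:
            break
        # Keep only the top `beam_width` partial sequences at this length.
        expansions.sort(key=lambda item: item[1], reverse=True)
        beam = expansions[:beam_width]
        if beam[0][1] > best[1]:
            best = beam[0]
    return best[0]

Because the best-scoring sequence seen at any length is retained, this sketch can return fewer than max_examples demonstrations when a shorter prompt scores higher, which is one way length, composition, and ordering can be treated jointly rather than in isolation. The cost is roughly max_examples * beam_width * |pool| scorer calls instead of an exhaustive search over all orderings.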

View on arXiv
@article{gao2025_2503.08030,
  title={Learning to Search Effective Example Sequences for In-Context Learning},
  author={Xiang Gao and Ankita Sinha and Kamalika Das},
  journal={arXiv preprint arXiv:2503.08030},
  year={2025}
}