A Survey on In-context Learning

Qingxiu Dong
Lei Li
Damai Dai
Ce Zheng
Jingyuan Ma
Rui Li
Heming Xia
Jingjing Xu
Zhiyong Wu
Baobao Chang
Xu Sun
Lei Li
Zhifang Sui
Abstract

With the increasing capabilities of large language models (LLMs), in-context learning (ICL) has emerged as a new paradigm for natural language processing (NLP), in which LLMs make predictions based on contexts augmented with a few examples. Exploring ICL to evaluate and extrapolate the abilities of LLMs has become a significant trend. In this paper, we survey and summarize the progress and challenges of ICL. We first present a formal definition of ICL and clarify its relation to related studies. Then, we organize and discuss advanced techniques, including training strategies, prompt design strategies, and related analysis. Additionally, we explore various ICL application scenarios, such as data engineering and knowledge updating. Finally, we address the challenges of ICL and suggest potential directions for further research. We hope that our work encourages more research on uncovering how ICL works and on improving ICL.
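To make the core idea concrete, here is a minimal sketch of how an ICL prompt is typically assembled: a few labeled demonstrations are concatenated with an unlabeled query, and the model predicts the answer purely from this context, with no weight updates. The sentiment task, examples, and template below are illustrative assumptions, not taken from the paper.

```python
# Minimal ICL prompt construction sketch (illustrative task and template).
# The model is expected to complete the final "Sentiment:" slot by analogy
# with the in-context demonstrations; no parameters are updated.

def build_icl_prompt(demonstrations, query,
                     template="Review: {x}\nSentiment: {y}"):
    """Concatenate labeled demonstrations and an unlabeled query into one prompt."""
    parts = [template.format(x=x, y=y) for x, y in demonstrations]
    # For the query, leave the label slot empty so the model fills it in.
    parts.append(template.format(x=query, y="").rstrip())
    return "\n\n".join(parts)

demos = [
    ("The plot was gripping from start to finish.", "positive"),
    ("I walked out halfway through.", "negative"),
]
prompt = build_icl_prompt(demos, "A beautifully shot but tedious film.")
print(prompt)
```

The resulting string ends with an open `Sentiment:` slot; in practice it would be sent to an LLM, which predicts the label for the query from the demonstrations alone.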
