ResearchTrend.AI



Position: Episodic Memory is the Missing Piece for Long-Term LLM Agents

10 February 2025
Mathis Pink
Qinyuan Wu
Vy A. Vo
Javier S. Turek
Jianing Mu
Alexander G. Huth
Mariya Toneva
Communities: LLMAG · KELM · CLL
Abstract

As Large Language Models (LLMs) evolve from text-completion tools into fully fledged agents operating in dynamic environments, they must address the challenge of continually learning and retaining long-term knowledge. Many biological systems solve this challenge with episodic memory, which supports single-shot learning of instance-specific contexts. Inspired by this, we present an episodic memory framework for LLM agents, centered around five key properties of episodic memory that underlie adaptive and context-sensitive behavior. With various research efforts already partially covering these properties, this position paper argues that now is the right time for an explicit, integrated focus on episodic memory to catalyze the development of long-term agents. To this end, we outline a roadmap that unites several research directions under the goal of supporting all five properties of episodic memory for more efficient long-term LLM agents.

@article{pink2025_2502.06975,
  title={Position: Episodic Memory is the Missing Piece for Long-Term LLM Agents},
  author={Mathis Pink and Qinyuan Wu and Vy Ai Vo and Javier Turek and Jianing Mu and Alexander Huth and Mariya Toneva},
  journal={arXiv preprint arXiv:2502.06975},
  year={2025}
}