ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

Rethinking Memory in AI: Taxonomy, Operations, Topics, and Future Directions

1 May 2025
Yiming Du
Wenyu Huang
Danna Zheng
Zhaowei Wang
Sébastien Montella
Mirella Lapata
Kam-Fai Wong
Jeff Z. Pan
Abstract

Memory is a fundamental component of AI systems, underpinning agents based on large language models (LLMs). While prior surveys have focused on memory applications with LLMs, they often overlook the atomic operations that underlie memory dynamics. In this survey, we first categorize memory representations into parametric, contextual structured, and contextual unstructured types, and then introduce six fundamental memory operations: Consolidation, Updating, Indexing, Forgetting, Retrieval, and Compression. We systematically map these operations to the most relevant research topics across long-term, long-context, parametric-modification, and multi-source memory. By reframing memory systems through the lens of atomic operations and representation types, this survey provides a structured and dynamic perspective on research, benchmark datasets, and tools related to memory in AI, clarifying the functional interplay in LLM-based agents while outlining promising directions for future research. (The paper list, datasets, methods, and tools are available at this https URL.)
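The abstract's taxonomy can be sketched as a pair of enumerations. This is purely illustrative: the class and member names below are paraphrased from the abstract and are not drawn from the paper or any accompanying code.

```python
from enum import Enum, auto

class MemoryRepresentation(Enum):
    """The three memory representation types named in the abstract."""
    PARAMETRIC = auto()               # knowledge encoded in model weights
    CONTEXTUAL_STRUCTURED = auto()    # e.g. structured external context
    CONTEXTUAL_UNSTRUCTURED = auto()  # e.g. raw text or dialogue history

class MemoryOperation(Enum):
    """The six atomic memory operations named in the abstract."""
    CONSOLIDATION = auto()
    UPDATING = auto()
    INDEXING = auto()
    FORGETTING = auto()
    RETRIEVAL = auto()
    COMPRESSION = auto()

# The survey maps operations onto research topics; a concrete memory
# system would implement some subset of the six operations over one
# or more representation types.
print(len(MemoryRepresentation), len(MemoryOperation))  # 3 6
```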

@article{du2025_2505.00675,
  title={Rethinking Memory in AI: Taxonomy, Operations, Topics, and Future Directions},
  author={Yiming Du and Wenyu Huang and Danna Zheng and Zhaowei Wang and S{\'e}bastien Montella and Mirella Lapata and Kam-Fai Wong and Jeff Z. Pan},
  journal={arXiv preprint arXiv:2505.00675},
  year={2025}
}