[2603.03296] PlugMem: A Task-Agnostic Plugin Memory Module for LLM Agents
Computer Science > Computation and Language
arXiv:2603.03296 (cs)
[Submitted on 6 Feb 2026]

Title: PlugMem: A Task-Agnostic Plugin Memory Module for LLM Agents
Authors: Ke Yang, Zixi Chen, Xuan He, Jize Jiang, Michel Galley, Chenglong Wang, Jianfeng Gao, Jiawei Han, ChengXiang Zhai

Abstract: Long-term memory is essential for large language model (LLM) agents operating in complex environments, yet existing memory designs are either task-specific and non-transferable, or task-agnostic but less effective due to low task-relevance and context explosion from raw memory retrieval. We propose PlugMem, a task-agnostic plugin memory module that can be attached to arbitrary LLM agents without task-specific redesign. Motivated by the fact that decision-relevant information is concentrated as abstract knowledge rather than raw experience, we draw on cognitive science to structure episodic memories into a compact, extensible knowledge-centric memory graph that explicitly represents propositional and prescriptive knowledge. This representation enables efficient memory retrieval and reasoning over task-relevant knowledge, rather than verbose raw trajectories, and departs from other graph-based methods like GraphRAG by treating knowledge as the unit of memory access and organization instead of entities or text chunks. We evaluate PlugMem unc...
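To make the abstract's central idea concrete, here is a minimal, hypothetical sketch of a knowledge-centric memory graph in the spirit described: nodes are compact knowledge units (propositional facts and prescriptive rules) distilled from episodes, and retrieval ranks knowledge units rather than raw trajectories or text chunks. All class and method names below are illustrative assumptions, not the paper's actual API; the simple word-overlap scorer stands in for whatever retrieval mechanism PlugMem uses.

```python
from dataclasses import dataclass, field

@dataclass
class KnowledgeNode:
    """A compact unit of abstract knowledge distilled from episodic experience."""
    kind: str                                     # "propositional" (fact) or "prescriptive" (rule)
    text: str                                     # the knowledge statement itself
    sources: list = field(default_factory=list)   # episode ids it was distilled from
    links: set = field(default_factory=set)       # ids of related knowledge nodes

class MemoryGraph:
    """Knowledge-centric memory: the unit of access is a knowledge node,
    not an entity or a raw text chunk (contrast with GraphRAG-style graphs)."""

    def __init__(self):
        self.nodes = {}
        self._next_id = 0

    def add(self, kind, text, sources=(), related=()):
        """Insert a knowledge node and link it bidirectionally to related nodes."""
        nid = self._next_id
        self._next_id += 1
        node = KnowledgeNode(kind, text, list(sources), set(related))
        for r in related:
            self.nodes[r].links.add(nid)
        self.nodes[nid] = node
        return nid

    def retrieve(self, query, k=2):
        """Return the top-k knowledge units by word overlap with the query,
        so the agent receives compact knowledge instead of verbose trajectories."""
        q = set(query.lower().split())
        scored = sorted(
            self.nodes.items(),
            key=lambda item: -len(q & set(item[1].text.lower().split())),
        )
        return [node for _, node in scored[:k]]

# Hypothetical usage: distill two knowledge units from past episodes, then query.
mem = MemoryGraph()
fact = mem.add("propositional", "the red key opens the storage room door",
               sources=["episode-3"])
mem.add("prescriptive", "pick up keys before exploring locked rooms",
        related=[fact])
hits = mem.retrieve("how do I open the storage room door", k=1)
print(hits[0].kind, "-", hits[0].text)
```

The key design point this sketch illustrates is the abstract's claim about memory granularity: because each node is already an abstract, decision-relevant statement, retrieval returns a few short knowledge units and avoids the context explosion of replaying raw episodes.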