[2602.22402] Contextual Memory Virtualisation: DAG-Based State Management and Structurally Lossless Trimming for LLM Agents

arXiv - AI · 4 min read

Summary

The paper presents Contextual Memory Virtualisation (CMV), a system for managing state in large language model (LLM) agents. CMV models session history as a Directed Acyclic Graph (DAG) to enable context reuse across sessions, and applies a structurally lossless trimming algorithm to reduce token counts.

Why It Matters

As LLMs are increasingly used for complex reasoning tasks, managing their state effectively becomes critical. CMV addresses the limitations of current context management by preserving essential information while optimizing resource usage, making it relevant for developers and researchers in AI and software engineering.

Key Takeaways

  • CMV employs a DAG structure for effective state management in LLMs.
  • The three-pass trimming algorithm achieves a mean token reduction of 20%, with up to 86% in specific cases.
  • The system maintains verbatim user and assistant interactions, ensuring no loss of critical information.
  • A case study shows significant efficiency gains, especially in sessions involving mixed tool use.
  • CMV's approach can enhance the economic viability of prompt caching in LLM applications.
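The snapshot and branch primitives behind these takeaways can be sketched roughly as follows. This is a hypothetical illustration only: the class, field, and method names are assumptions modelled on the abstract's terminology (snapshot, branch, DAG of messages), not the paper's actual API.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Node:
    """One immutable message in the session-history DAG."""
    node_id: int
    role: str            # "user", "assistant", or "tool"
    content: str
    parents: tuple = ()  # edges back to prior nodes

class SessionDAG:
    """Minimal sketch of snapshot/branch over an append-only message DAG."""

    def __init__(self):
        self.nodes = {}   # node_id -> Node
        self.head = None  # id of the most recent node
        self._next = 0

    def append(self, role, content):
        parents = (self.head,) if self.head is not None else ()
        node = Node(self._next, role, content, parents)
        self.nodes[node.node_id] = node
        self.head = node.node_id
        self._next += 1
        return node.node_id

    def snapshot(self):
        # A snapshot is just a pointer to the current head node;
        # the immutable history behind it never changes.
        return self.head

    def branch(self, snapshot_id):
        # A branch reuses the shared prefix up to the snapshot.
        # Nodes are immutable, so only the index is copied.
        fork = SessionDAG()
        fork.nodes = dict(self.nodes)
        fork.head = snapshot_id
        fork._next = self._next
        return fork
```

Because nodes are immutable and branches share the prefix, two parallel sessions forked from the same snapshot diverge without duplicating or mutating earlier context, which is what makes the reuse (and prompt-cache reuse) cheap.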

Computer Science > Software Engineering · arXiv:2602.22402 (cs) · [Submitted on 25 Feb 2026]

Title: Contextual Memory Virtualisation: DAG-Based State Management and Structurally Lossless Trimming for LLM Agents
Authors: Cosmo Santoni

Abstract: As large language models engage in extended reasoning tasks, they accumulate significant state -- architectural mappings, trade-off decisions, codebase conventions -- within the context window. This understanding is lost when sessions reach context limits and undergo lossy compaction. We propose Contextual Memory Virtualisation (CMV), a system that treats accumulated LLM understanding as version-controlled state. Borrowing from operating system virtual memory, CMV models session history as a Directed Acyclic Graph (DAG) with formally defined snapshot, branch, and trim primitives that enable context reuse across independent parallel sessions. We introduce a three-pass structurally lossless trimming algorithm that preserves every user message and assistant response verbatim while reducing token counts by a mean of 20% and up to 86% for sessions with significant overhead by stripping mechanical bloat such as raw tool outputs, base64 images, and metadata. A single-user case-study evaluation across 76 real-world coding sessions demonstrates that trimming remai...
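The "structurally lossless" invariant in the abstract can be conveyed with a short sketch: every user and assistant message is preserved verbatim, and only mechanical bloat (here, oversized tool output) is shortened. The paper describes a three-pass algorithm; this single-pass sketch, with an assumed message format and threshold, illustrates only the invariant, not the actual algorithm.

```python
def trim(messages, max_tool_chars=200):
    """Hypothetical sketch of structurally lossless trimming.

    Invariant: user and assistant messages pass through untouched;
    only tool-output bulk is replaced with a short marker.
    """
    trimmed = []
    for msg in messages:
        if msg["role"] in ("user", "assistant"):
            trimmed.append(msg)  # preserved byte-for-byte
        else:  # tool output: strip the bulk, note how much was cut
            body = msg["content"]
            if len(body) > max_tool_chars:
                cut = len(body) - max_tool_chars
                body = body[:max_tool_chars] + f" [... {cut} chars trimmed]"
            trimmed.append({"role": msg["role"], "content": body})
    return trimmed
```

Under this invariant the trimmed transcript still contains every decision the user and the model actually exchanged, which is why the paper can call the reduction lossless with respect to conversational structure.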

Related Articles

I Asked ChatGPT 500 Questions. Here Are the Ads I Saw Most Often | WIRED

Ads are rolling out across the US on ChatGPT’s free tier. I asked OpenAI's bot 500 questions to see what these ads were like and how they...

Wired - AI · 9 min

Abacus.Ai Claw LLM consumes an incredible amount of credit without any usage :(

Three days ago, I clicked the "Deploy OpenClaw In Seconds" button to get an overview of the new service, but I didn't build any automatio...

Reddit - Artificial Intelligence · 1 min

Google’s Gemini AI app debuts in Hong Kong

Tech giant’s chatbot service tops Apple’s app store chart in the city.

AI Tools & Products · 2 min

Google Launches Gemini Import Tools to Poach Users From Rival AI Apps

Anyone looking to switch their AI assistant will find it surprisingly easy, as it only takes a few steps to move from A to B. This is not...

AI Tools & Products · 4 min
