[2603.27116] The Price of Meaning: Why Every Semantic Memory System Forgets
Computer Science > Artificial Intelligence

arXiv:2603.27116 (cs)
[Submitted on 28 Mar 2026]

Title: The Price of Meaning: Why Every Semantic Memory System Forgets
Authors: Sambartha Ray Barman, Andrey Starenky, Sofia Bodnar, Nikhil Narasimhan, Ashwin Gopinath

Abstract: Every major AI memory system in production today organises information by meaning. That organisation enables generalisation, analogy, and conceptual retrieval -- but it comes at a price. We prove that the same geometric structure that enables semantic generalisation makes interference, forgetting, and false recall inescapable. We formalise this tradeoff for \textit{semantically continuous kernel-threshold memories}: systems whose retrieval score is a monotone function of an inner product in a semantic feature space with finite local intrinsic dimension. Within this class we derive four results: (1) semantically useful representations have finite effective rank; (2) finite local dimension implies positive competitor mass in retrieval neighbourhoods; (3) under growing memory, retention decays to zero, yielding power-law forgetting curves under power-law arrival statistics; (4) for associative lures satisfying a $\delta$-convexity condition, false recall cannot be eliminated by threshold tuning. We test these predictions across five architectures: vector...
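The class of systems the abstract describes can be sketched in a few lines: a kernel-threshold memory scores each stored vector by a monotone function of its inner product with the query and returns everything above a threshold. The toy below is an illustrative assumption, not the paper's construction: the kernel (`tanh`), the dimension, the threshold, and the lure-generation scheme are all placeholders chosen to make the "positive competitor mass" effect visible, namely that adding semantically close items to memory lets lures cross the retrieval threshold alongside the target.

```python
import numpy as np

rng = np.random.default_rng(0)

def retrieve(memory, query, threshold, kernel=np.tanh):
    """Kernel-threshold retrieval: return indices of stored rows whose
    kernel(inner product with query) meets the threshold."""
    scores = kernel(memory @ query)
    return np.flatnonzero(scores >= threshold)

# Hypothetical semantic feature space: unit vectors in a small dimension.
d = 8
target = rng.normal(size=d)
target /= np.linalg.norm(target)

# Small memory: the target plus a few unrelated items.
distractors = rng.normal(size=(4, d))
distractors /= np.linalg.norm(distractors, axis=1, keepdims=True)
memory_small = np.vstack([target, distractors])
hits_small = retrieve(memory_small, target, threshold=0.7)

# Growing memory: add near neighbours of the target. These sit inside the
# retrieval neighbourhood, so some of them also exceed the threshold --
# a toy version of competitor mass producing false recall.
lures = target + 0.15 * rng.normal(size=(50, d))
lures /= np.linalg.norm(lures, axis=1, keepdims=True)
memory_big = np.vstack([memory_small, lures])
hits_big = retrieve(memory_big, target, threshold=0.7)

print(f"hits in small memory: {len(hits_small)}")
print(f"hits after adding semantic neighbours: {len(hits_big)}")
```

Raising the threshold shrinks the lure set but eventually excludes degraded copies of the target as well; the paper's $\delta$-convexity result says that for a class of lures no threshold setting removes false recall entirely.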