[2603.19664] The Residual Stream Is All You Need: On the Redundancy of the KV Cache in Transformer Inference
Computer Science > Machine Learning
arXiv:2603.19664 (cs)
[Submitted on 20 Mar 2026]

Title: The Residual Stream Is All You Need: On the Redundancy of the KV Cache in Transformer Inference
Authors: Kaleem Ullah Qasim, Jiashu Zhang, Muhammad Kafeel Shaheen, Razan Alharith, Heying Zhang

Abstract: The key-value (KV) cache is widely treated as essential state in transformer inference, and a large body of work engineers policies to compress, evict, or approximate its entries. We prove that this state is entirely redundant: keys and values at every layer are deterministic projections of the residual stream, and recomputing them from a single residual vector per token incurs exactly zero reconstruction error — the recomputed entries are bit-identical, not approximate. We verify this across six models from four architecture families (135M to 4B parameters). Cross-task residual patching at every layer produces D_KL = 0 between patched and original output distributions, confirming that the residual stream satisfies a Markov property and is the sole information-carrying state. Removing the cache entirely and recomputing from scratch yields token-identical output under greedy decoding on all models tested. We build on this result with KV-Direct, a bounded-memory inference scheme that checkpoints residual vectors ...
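The central claim — that keys and values are deterministic functions of the residual stream, so recomputing them from a cached residual vector is bit-identical to reading them from a KV cache — can be illustrated with a minimal sketch. The weight shapes, the RMS normalization, and all names below are assumptions for illustration, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 16  # hypothetical model width

# Hypothetical per-layer projection weights; in a decoder-only transformer
# these would be the attention key/value matrices applied after the
# pre-attention normalization.
W_k = rng.standard_normal((d, d))
W_v = rng.standard_normal((d, d))

def rmsnorm(x):
    # Pre-attention normalization, as used by many decoder-only models.
    return x / np.sqrt(np.mean(x**2) + 1e-6)

def kv_from_residual(resid):
    # Keys and values are deterministic projections of the residual vector:
    # no randomness and no other state enters the computation.
    h = rmsnorm(resid)
    return W_k @ h, W_v @ h

# Residual-stream vector for one token at this layer.
resid = rng.standard_normal(d)

# "Cache" path: compute K/V once and store them.
k_cached, v_cached = kv_from_residual(resid)

# "Recompute" path: later, reconstruct K/V from the stored residual alone.
k_recomputed, v_recomputed = kv_from_residual(resid)

# Bit-identical, not approximately equal: the same float operations on the
# same inputs yield the same bits, so no tolerance is needed.
assert np.array_equal(k_cached, k_recomputed)
assert np.array_equal(v_cached, v_recomputed)
```

The assertions use exact equality rather than `np.allclose`, mirroring the abstract's claim of zero (not approximate) reconstruction error: identical deterministic floating-point operations on identical inputs produce identical bits.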