[2510.25883] The Information-Theoretic Imperative: Compression and the Epistemic Foundations of Intelligence
Computer Science > Artificial Intelligence
arXiv:2510.25883 (cs)
[Submitted on 29 Oct 2025 (v1), last revised 28 Feb 2026 (this version, v2)]

Title: The Information-Theoretic Imperative: Compression and the Epistemic Foundations of Intelligence
Authors: Christian Dittrich, Jennifer Flygare Kinne

Abstract: Why do brains and deep networks converge on similar representations? Task-optimized artificial neural networks quantitatively predict primate ventral stream responses despite radically different substrates and optimization dynamics. This convergence demands explanation beyond shared natural image statistics or task structure alone. The Compression Efficiency Principle (CEP) specifies the selection mechanism: representations exploiting unstable correlations pay a growing "exception tax" (approximately linear excess codelength under shortcut-flipping shifts), while representations encoding shift-stable invariants amortize this cost. When environments provide intervention-rich shifts and exhibit approximately modular causal structure, these invariants align with causal mechanisms. The framework offers a unified lens on three biological signatures -- steep metabolic constraints on neural signaling, high coding efficiency in early sensory pathways, and hierarchical tolerance in the ventral stream -- and ...
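The "exception tax" in the abstract can be illustrated with a standard information-theoretic fact: encoding data drawn from Bernoulli(q) with a code optimal for Bernoulli(p) costs an extra KL(q || p) bits per symbol. The sketch below is illustrative only and not taken from the paper; the specific probabilities (a shortcut correlation of 0.9 that degrades under shift) are hypothetical assumptions chosen to show how the excess codelength grows as a spurious correlation flips, while a code built on a shift-stable invariant would pay no such excess.

```python
import math

def excess_codelength(q: float, p: float) -> float:
    """Excess bits per symbol when data ~ Bernoulli(q) is encoded
    with a code optimal for Bernoulli(p), i.e. KL(q || p)."""
    def kl_term(a: float, b: float) -> float:
        return 0.0 if a == 0 else a * math.log2(a / b)
    return kl_term(q, p) + kl_term(1 - q, 1 - p)

# Hypothetical shortcut feature: correlated with the label at rate 0.9
# during training, so the code is tuned to p = 0.9.
shortcut_p = 0.9

# As the environment shifts and the shortcut correlation degrades (and
# eventually flips), the per-symbol "exception tax" grows.
for shifted_q in (0.9, 0.7, 0.5, 0.3, 0.1):
    tax = excess_codelength(shifted_q, shortcut_p)
    print(f"shifted correlation q={shifted_q:.1f}  exception tax = {tax:.3f} bits/symbol")
```

A representation encoding an invariant whose statistics do not move under the shift keeps q = p, so its excess codelength stays at zero; the cumulative tax over many symbols is what the abstract describes as growing under shortcut-flipping shifts.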