[2602.13421] Metabolic cost of information processing in Poisson variational autoencoders
Summary
This paper examines the metabolic cost of information processing in Poisson variational autoencoders (P-VAEs), arguing that computation in biological systems is fundamentally energy-constrained and exploring what this constraint implies for computational theory.
Why It Matters
Understanding the metabolic costs of information processing in neural networks matters for building energy-efficient AI models. This research shows how P-VAEs trade coding fidelity against energy expenditure, a trade-off that is increasingly relevant to sustainable AI development.
Key Takeaways
- Poisson variational autoencoders (P-VAEs) offer a principled path toward energy-aware computation.
- The Kullback-Leibler (KL) divergence term in the P-VAE objective is proportional to the prior firing rates of model neurons, yielding an emergent metabolic cost that penalizes high baseline activity.
- Increasing the KL term weighting coefficient in P-VAEs enhances sparsity and reduces spiking activity.
- The study contrasts P-VAEs with standard Gaussian VAEs, showing that this metabolic cost structure arises from Poisson statistics and is absent in the Gaussian case.
- This research lays the groundwork for a resource-constrained theory of computation in AI.
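The coupling in the takeaways above can be illustrated numerically. The following is a minimal sketch (not code from the paper): it uses the closed-form KL divergence between two Poisson distributions, KL(Pois(λ) ‖ Pois(r)) = λ log(λ/r) − λ + r, and checks it against direct summation over spike counts. The additive term r is the prior firing rate, so even a near-silent posterior pays a cost that scales with baseline activity.

```python
import math

def poisson_kl(lam_q: float, lam_p: float) -> float:
    """Closed-form KL divergence KL(Pois(lam_q) || Pois(lam_p))."""
    return lam_q * math.log(lam_q / lam_p) - lam_q + lam_p

def poisson_kl_series(lam_q: float, lam_p: float, kmax: int = 200) -> float:
    """Same KL by direct summation over spike counts k, as a sanity check."""
    total = 0.0
    for k in range(kmax):
        log_pq = -lam_q + k * math.log(lam_q) - math.lgamma(k + 1)
        log_pp = -lam_p + k * math.log(lam_p) - math.lgamma(k + 1)
        total += math.exp(log_pq) * (log_pq - log_pp)
    return total

# As the posterior rate lam_q shrinks toward zero, the KL approaches the
# prior rate lam_p: the coding rate is tied to a cost set by baseline firing.
for lam_q in (2.0, 0.5, 0.1, 0.01):
    print(lam_q, round(poisson_kl(lam_q, 1.0), 4))
```

Here the prior rate is fixed at 1.0 spikes per bin purely for illustration; the qualitative point is that the KL (the coding cost) is bounded below by a term proportional to the prior firing rate.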
Statistics > Machine Learning
arXiv:2602.13421 (stat) [Submitted on 13 Feb 2026]
Title: Metabolic cost of information processing in Poisson variational autoencoders
Authors: Hadi Vafaii, Jacob L. Yates
Abstract: Computation in biological systems is fundamentally energy-constrained, yet standard theories of computation treat energy as freely available. Here, we argue that variational free energy minimization under a Poisson assumption offers a principled path toward an energy-aware theory of computation. Our key observation is that the Kullback-Leibler (KL) divergence term in the Poisson free energy objective becomes proportional to the prior firing rates of model neurons, yielding an emergent metabolic cost term that penalizes high baseline activity. This structure couples an abstract information-theoretic quantity -- the *coding rate* -- to a concrete biophysical variable -- the *firing rate* -- which enables a trade-off between coding fidelity and energy expenditure. Such a coupling arises naturally in the Poisson variational autoencoder (P-VAE) -- a brain-inspired generative model that encodes inputs as discrete spike counts and recovers a spiking form of *sparse coding* as a special case -- but is absent from standard Gaussian VAEs. To demonstrate that this metabolic cost structure is unique to the Poi...
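The coupling described in the abstract can be made explicit. A sketch in standard VAE notation (the symbols β, λ, and r are my labels for the KL weight, posterior rate, and prior rate, not necessarily the paper's):

$$
\begin{aligned}
\mathcal{F} &= -\,\mathbb{E}_{q(z \mid x)}\!\left[\log p(x \mid z)\right] + \beta\, D_{\mathrm{KL}}\!\left(q(z \mid x)\,\|\,p(z)\right), \\
D_{\mathrm{KL}}\!\left(\mathrm{Pois}(\lambda)\,\|\,\mathrm{Pois}(r)\right) &= \lambda \log\frac{\lambda}{r} - \lambda + r.
\end{aligned}
$$

The KL divergence thus carries an additive contribution from the prior rate $r$, so minimizing the free energy directly penalizes high baseline firing, which is the emergent metabolic cost term the abstract describes.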