[2602.13421] Metabolic cost of information processing in Poisson variational autoencoders

arXiv - AI · 4 min read

Summary

This article explores the metabolic cost of information processing in Poisson variational autoencoders, emphasizing the energy constraints in biological systems and their implications for computational theories.

Why It Matters

Understanding the metabolic costs of information processing in neural networks is crucial for developing energy-efficient AI models. This research shows how Poisson variational autoencoders trade off coding fidelity against energy expenditure, which is increasingly relevant to sustainable AI development.

Key Takeaways

  • Poisson variational autoencoders (P-VAEs) offer a unique approach to energy-aware computation.
  • The Kullback-Leibler (KL) divergence term in the Poisson free energy objective is proportional to the prior firing rates of model neurons, yielding an emergent metabolic cost that penalizes high baseline activity.
  • Increasing the KL term weighting coefficient in P-VAEs enhances sparsity and reduces spiking activity.
  • The study contrasts P-VAEs with Gaussian VAEs, highlighting the specific benefits of Poisson statistics.
  • This research lays the groundwork for a resource-constrained theory of computation in AI.
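The coding-rate/firing-rate coupling in the takeaways follows from the standard closed form for the KL divergence between two Poisson distributions, KL(Pois(r) || Pois(λ)) = r log(r/λ) - r + λ. A minimal numerical sketch (illustrative only, not the paper's code) shows that for a fixed posterior-to-prior rate ratio this KL scales linearly with the prior firing rate, whereas the Gaussian KL contains no such activity-like term:

```python
import math

def poisson_kl(r, lam):
    """KL( Poisson(r) || Poisson(lam) ), standard closed form, in nats."""
    return r * math.log(r / lam) - r + lam

def gaussian_kl(mu, var):
    """KL( N(mu, var) || N(0, 1) ) -- depends only on mean and variance,
    with no firing-rate-like quantity to penalize."""
    return 0.5 * (mu ** 2 + var - math.log(var) - 1.0)

# For a fixed posterior/prior rate ratio delta = r / lam, the Poisson KL
# equals lam * (delta*log(delta) - delta + 1): proportional to the prior
# rate lam, so high baseline activity is penalized directly.
delta = 2.0
for lam in (1.0, 2.0, 4.0):
    print(lam, poisson_kl(delta * lam, lam) / lam)  # ratio is constant
```

Here `delta` and the example rates are arbitrary illustration values; only the two closed-form KL identities are standard.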

arXiv:2602.13421 (stat) · Statistics > Machine Learning · Submitted on 13 Feb 2026

Title: Metabolic cost of information processing in Poisson variational autoencoders
Authors: Hadi Vafaii, Jacob L. Yates

Abstract: Computation in biological systems is fundamentally energy-constrained, yet standard theories of computation treat energy as freely available. Here, we argue that variational free energy minimization under a Poisson assumption offers a principled path toward an energy-aware theory of computation. Our key observation is that the Kullback-Leibler (KL) divergence term in the Poisson free energy objective becomes proportional to the prior firing rates of model neurons, yielding an emergent metabolic cost term that penalizes high baseline activity. This structure couples an abstract information-theoretic quantity -- the *coding rate* -- to a concrete biophysical variable -- the *firing rate* -- which enables a trade-off between coding fidelity and energy expenditure. Such a coupling arises naturally in the Poisson variational autoencoder (P-VAE) -- a brain-inspired generative model that encodes inputs as discrete spike counts and recovers a spiking form of *sparse coding* as a special case -- but is absent from standard Gaussian VAEs. To demonstrate that this metabolic cost structure is unique to the Poi...
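The sparsity effect of increasing the KL weighting coefficient can be illustrated with a toy rate-fidelity trade-off. This is a hedged sketch under stated assumptions, not the P-VAE objective: the quadratic reconstruction term, the prior rate `lam`, the target `x`, and the grid search are all chosen for illustration only.

```python
import math

def poisson_kl(r, lam):
    """KL( Poisson(r) || Poisson(lam) ), standard closed form, in nats."""
    return r * math.log(r / lam) - r + lam

def best_rate(x, lam=0.5, beta=1.0):
    """Minimize a toy objective: squared reconstruction error plus a
    beta-weighted Poisson KL toward a low prior rate (grid search)."""
    grid = [0.01 * k for k in range(1, 2001)]  # candidate rates in (0, 20]
    return min(grid, key=lambda r: 0.5 * (x - r) ** 2 + beta * poisson_kl(r, lam))

# Heavier KL weighting pulls the inferred rate toward the low prior rate,
# i.e., sparser and metabolically cheaper activity.
rates = [best_rate(x=5.0, beta=b) for b in (0.0, 1.0, 4.0)]
print(rates)
```

With `beta = 0` the inferred rate matches the target exactly; as `beta` grows the optimum shrinks toward the prior rate, mirroring the takeaway that a larger KL coefficient enhances sparsity and reduces spiking activity.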
