[2603.00588] Energy-Efficient Information Representation in MNIST Classification Using Biologically Inspired Learning
Computer Science > Machine Learning

arXiv:2603.00588 (cs) [Submitted on 28 Feb 2026]

Title: Energy-Efficient Information Representation in MNIST Classification Using Biologically Inspired Learning

Authors: Patrick Stricker, Florian Röhrbein, Andreas Knoblauch

Abstract: Efficient representation learning is essential for optimal information storage and classification, yet it is frequently overlooked in artificial neural networks (ANNs). This neglect yields networks that can be overparameterized by factors of up to 13, increasing redundancy and energy consumption. As demand for large language models (LLMs) grows and their scale increases, these issues become more pronounced, raising significant ethical and environmental concerns. We analyze our previously developed biologically inspired learning rule using information-theoretic concepts, evaluating its efficiency on the MNIST classification task. The proposed rule, which emulates the brain's structural plasticity, naturally prevents overparameterization by optimizing synaptic usage and retaining only the essential number of synapses. Furthermore, it outperforms backpropagation (BP) in terms of efficiency and storage capacity. It also eliminates the need for pre-optimization of network architecture and enhances adaptability…
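The abstract's core idea, retaining only the essential synapses via structural plasticity, can be loosely illustrated with a generic prune-and-regrow step on a weight matrix. This is a minimal sketch of magnitude-based pruning with random regrowth, not the authors' actual learning rule; all names, matrix shapes, and the pruning/regrowth fractions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy weight matrix standing in for one layer of an MNIST classifier
# (784 input pixels -> 100 hidden units); shapes are illustrative only.
W = rng.normal(size=(784, 100))

def prune_and_regrow(W, prune_frac=0.2, regrow_frac=0.5, rng=rng):
    """One generic structural-plasticity-style step: remove the
    weakest synapses, then regrow a fraction of them at random.
    (Illustrative only; not the learning rule from the paper.)"""
    n_prune = int(prune_frac * W.size)
    flat_mag = np.abs(W).ravel()
    # Indices of the n_prune smallest-magnitude synapses.
    weakest = np.argpartition(flat_mag, n_prune)[:n_prune]
    W = W.copy()
    W.ravel()[weakest] = 0.0
    # Regrow: reinitialize a random subset of the zeroed synapses
    # with small weights, mimicking new synapse formation.
    n_regrow = int(regrow_frac * n_prune)
    regrow = rng.choice(weakest, size=n_regrow, replace=False)
    W.ravel()[regrow] = rng.normal(scale=0.01, size=n_regrow)
    return W

W2 = prune_and_regrow(W)
sparsity = float(np.mean(W2 == 0.0))
print(f"fraction of pruned synapses: {sparsity:.2f}")  # 0.10
```

With 20% of synapses pruned and half of those regrown, the layer ends up 10% sparse after one step; iterating such steps during training is one common way sparsity emerges without pre-optimizing the architecture.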