[2512.01906] Delays in Spiking Neural Networks: A State Space Model Approach
Computer Science > Machine Learning
arXiv:2512.01906 (cs)
[Submitted on 1 Dec 2025 (v1), last revised 26 Mar 2026 (this version, v2)]

Title: Delays in Spiking Neural Networks: A State Space Model Approach
Authors: Sanja Karilanova, Subhrakanti Dey, Ayça Özçelikkale

Abstract: Spiking neural networks (SNNs) are biologically inspired, event-driven models suited for temporal data processing and energy-efficient neuromorphic computing. In SNNs, richer neuronal dynamics allow capturing more complex temporal dependencies, with delays playing a crucial role by allowing past inputs to directly influence present spiking behavior. We propose a general framework for incorporating delays into SNNs through additional state variables. The proposed mechanism enables each neuron to access a finite temporal input history. The framework is agnostic to the neuron model and can therefore be seamlessly integrated into standard spiking neuron models such as Leaky Integrate-and-Fire (LIF) and Adaptive LIF (adLIF). We analyze how the duration of the delays and the learnable parameters associated with them affect performance. We investigate the trade-offs in the network architecture due to the additional state variables introduced by the delay mechanism. Experiments on the Spiking Heidelberg Digits (SHD) dataset show that the proposed mechanism m...
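The abstract does not spell out the state-space realization, but one minimal way to picture "delays as additional state variables" is a shift register of recent inputs feeding the neuron through per-tap weights. The sketch below is illustrative only: the class name DelayedLIF, the parameters n_delay, leak, threshold, and tap_weights, and all defaults are assumptions for exposition, not the paper's implementation.

```python
import numpy as np

class DelayedLIF:
    """Hypothetical sketch: an LIF neuron whose extra state variables
    hold the last n_delay inputs, so past inputs can directly influence
    present spiking behavior. A shift register is one simple state-space
    realization of the delay idea described in the abstract."""

    def __init__(self, n_delay=4, leak=0.9, threshold=1.0):
        self.leak = leak                     # membrane leak factor
        self.threshold = threshold           # spike threshold
        self.buffer = np.zeros(n_delay)      # extra state: recent input history
        self.tap_weights = np.ones(n_delay) / n_delay  # learnable in practice
        self.v = 0.0                         # membrane potential

    def step(self, x):
        # Shift the delay line and store the newest input (state update).
        self.buffer = np.roll(self.buffer, 1)
        self.buffer[0] = x
        # Effective input mixes the finite temporal history via tap weights.
        i_eff = float(self.tap_weights @ self.buffer)
        # Standard LIF dynamics on the delayed, mixed input.
        self.v = self.leak * self.v + i_eff
        spike = self.v >= self.threshold
        if spike:
            self.v = 0.0                     # reset after a spike
        return float(spike)

# Toy usage: a single impulse keeps driving the neuron for n_delay steps.
neuron = DelayedLIF()
print([neuron.step(x) for x in [1.0, 0.0, 0.0, 0.0, 0.0]])
```

Because the delay line sits entirely in the neuron's state, the same wrapper could be placed around other neuron models (e.g. adLIF), which is consistent with the framework being agnostic to the neuron model; the trade-off, as the abstract notes, is the extra state the buffer adds per neuron.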