[2604.05967] On Dominant Manifolds in Reservoir Computing Networks
Computer Science > Machine Learning

arXiv:2604.05967 (cs) [Submitted on 7 Apr 2026]

Title: On Dominant Manifolds in Reservoir Computing Networks
Authors: Noa Kaplan, Alberto Padoan, Anastasia Bizyaeva

Abstract: Understanding how training shapes the geometry of recurrent network dynamics is a central problem in time-series modeling. We study the emergence of low-dimensional dominant manifolds in the training of Reservoir Computing (RC) networks for temporal forecasting tasks. For a simplified linear and continuous-time reservoir model, we link the dimensionality and structure of the dominant modes directly to the intrinsic dimensionality and information content of the training data. In particular, for training data generated by an autonomous dynamical system, we relate the dominant modes of the trained reservoir to approximations of the Koopman eigenfunctions of the original system, illuminating an explicit connection between reservoir computing and the Dynamic Mode Decomposition algorithm. We illustrate the eigenvalue motion that generates the dominant manifolds during training in simulation, and discuss generalization to nonlinear RC via tangent dynamics and differential p-dominance.

Subjects: Machine Learning (cs.LG); Dynamical Systems (math.DS); Optimization and Control (math.OC)
MSC classes: 37N35 (Primary) 93C05 (Secondary)
Cite as: a...
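The abstract's connection between trained reservoir modes and the Dynamic Mode Decomposition (DMD) algorithm can be illustrated with a minimal sketch of exact DMD on noiseless snapshot data from a linear autonomous system. This is not the paper's method, only a standard DMD example: the system matrix `A`, the variable names, and the trajectory length are all illustrative assumptions.

```python
import numpy as np

# Illustrative sketch only: exact DMD on snapshots from a linear
# autonomous system x_{k+1} = A x_k. The matrix A and all names here
# are assumptions for demonstration, not taken from the paper.

rng = np.random.default_rng(0)
A = np.array([[0.9, -0.2],
              [0.1, 0.95]])       # ground-truth linear dynamics

# Generate one trajectory of snapshots from a random initial state
x = rng.standard_normal(2)
snapshots = [x]
for _ in range(50):
    x = A @ x
    snapshots.append(x)
X = np.array(snapshots).T          # columns are states, shape (2, 51)

# Paired snapshot matrices: X0 maps to X1 under one step of the dynamics
X0, X1 = X[:, :-1], X[:, 1:]

# Least-squares estimate of the one-step linear map (the DMD operator)
A_dmd = X1 @ np.linalg.pinv(X0)

# For a linear system, the eigenvalues of A_dmd approximate the Koopman
# eigenvalues; with noiseless data they recover those of A exactly.
eigvals = np.linalg.eigvals(A_dmd)
```

With noiseless data from a linear system, the least-squares fit recovers the true dynamics, so the DMD eigenvalues coincide with the spectrum of `A`; the paper's contribution concerns how such modes emerge inside a trained reservoir rather than this standalone fit.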