[2601.00473] Deep Neural Networks as Discrete Dynamical Systems: Implications for Physics-Informed Learning
Computer Science > Machine Learning
arXiv:2601.00473 (cs)
[Submitted on 1 Jan 2026 (v1), last revised 25 Mar 2026 (this version, v2)]

Title: Deep Neural Networks as Discrete Dynamical Systems: Implications for Physics-Informed Learning
Authors: Abhisek Ganguly, Santosh Ansumali, Sauro Succi

Abstract: We revisit the analogy between feed-forward deep neural networks (DNNs) and discrete dynamical systems derived from neural integral equations and their corresponding partial differential equation (PDE) forms. We present a comparative analysis between the numerical/exact solutions of the Burgers' and Eikonal equations and those obtained via PINNs. We show that PINN learning provides a different computational pathway from standard numerical discretization while approximating essentially the same underlying dynamics of the system. Within this framework, DNNs can be interpreted as discrete dynamical systems whose layer-wise evolution approaches attractors, and multiple parameter configurations may yield comparable solutions, reflecting the non-uniqueness of the inverse mapping. In contrast to the structured operators associated with finite-difference (FD) procedures, PINNs learn dense parameter representations that are not directly associated with classical discreti...
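To make the comparison concrete, the finite-difference baseline that the abstract contrasts with PINNs can be sketched as follows. This is a minimal explicit solver for the 1D viscous Burgers' equation, not the authors' implementation; the grid sizes, viscosity, and sinusoidal initial condition are illustrative assumptions.

```python
import numpy as np

def burgers_fd(nx=200, nt=2000, nu=0.01, L=2 * np.pi, T=0.5):
    """Explicit finite-difference solver for the 1D viscous Burgers' equation
    u_t + u u_x = nu * u_xx on a periodic domain [0, L)."""
    dx = L / nx
    dt = T / nt
    x = np.linspace(0.0, L, nx, endpoint=False)
    u = np.sin(x)  # smooth initial condition (illustrative choice)
    for _ in range(nt):
        # periodic neighbours via np.roll
        u_m = np.roll(u, 1)   # u[i-1]
        u_p = np.roll(u, -1)  # u[i+1]
        # upwind discretization of the convective term u * u_x
        conv = np.where(u > 0, u * (u - u_m) / dx, u * (u_p - u) / dx)
        # central discretization of the diffusive term nu * u_xx
        diff = nu * (u_p - 2.0 * u + u_m) / dx**2
        u = u + dt * (diff - conv)
    return x, u
```

A PINN, by contrast, would minimize the residual of the same PDE at sampled collocation points, so the "structured operator" above (the sparse roll/difference stencil) is replaced by dense learned weights, which is the distinction the abstract draws.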