[2601.00473] Deep Neural Networks as Discrete Dynamical Systems: Implications for Physics-Informed Learning

arXiv - Machine Learning 4 min read

About this article


Computer Science > Machine Learning — arXiv:2601.00473 (cs)

[Submitted on 1 Jan 2026 (v1), last revised 25 Mar 2026 (this version, v2)]

Title: Deep Neural Networks as Discrete Dynamical Systems: Implications for Physics-Informed Learning

Authors: Abhisek Ganguly, Santosh Ansumali, Sauro Succi

Abstract: We revisit the analogy between feed-forward deep neural networks (DNNs) and discrete dynamical systems derived from neural integral equations and their corresponding partial differential equation (PDE) forms. We present a comparative analysis between the numerical/exact solutions of the Burgers' and Eikonal equations and the same solutions obtained via physics-informed neural networks (PINNs). We show that PINN learning provides a computational pathway different from standard numerical discretization while approximating essentially the same underlying dynamics of the system. Within this framework, DNNs can be interpreted as discrete dynamical systems whose layer-wise evolution approaches attractors, and multiple parameter configurations may yield comparable solutions, reflecting the non-uniqueness of the inverse mapping. In contrast to the structured operators associated with finite-difference (FD) procedures, PINNs learn dense parameter representations that are not directly associated with classical discreti...
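The abstract's central picture — DNN layers as steps of a discrete dynamical system whose depth-wise evolution approaches an attractor — can be sketched with a toy residual-style map. This is a hypothetical illustration (not the paper's code): the weights, step size, and contraction assumption are all choices made here for demonstration.

```python
import numpy as np

# Toy illustration of the "layers as dynamics" view: a residual-style
# update x_{k+1} = x_k + h * (tanh(W x + b) - x). With small weights
# the map is contractive, so iterating it in depth (playing the role
# of time) drives the state toward a fixed point, i.e. an attractor.
# All parameters here are assumed for the sketch, not from the paper.

rng = np.random.default_rng(0)
W = rng.normal(size=(4, 4)) * 0.15   # small weights -> contractive map
b = rng.normal(size=4) * 0.1

def layer(x, h=0.5):
    # one "layer" = one explicit-Euler-like step of the dynamics
    return x + h * (np.tanh(W @ x + b) - x)

x = rng.normal(size=4)
for _ in range(200):                 # depth plays the role of time
    x = layer(x)

residual = float(np.linalg.norm(layer(x) - x))
print(residual)  # tiny: the depth-wise evolution has settled on an attractor
```

Different random initial states converge to the same fixed point under this contraction, loosely mirroring the abstract's remark that multiple configurations can yield comparable solutions.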

Originally published on March 26, 2026. Curated by AI News.
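For contrast with the PINN pathway, the "structured operator" route the abstract mentions can be sketched as an explicit finite-difference solver for the viscous Burgers' equation u_t + u u_x = ν u_xx. Grid size, viscosity, time step, and initial condition below are assumptions chosen for a stable demonstration, not values from the paper.

```python
import numpy as np

# Minimal finite-difference sketch (assumed parameters): explicit
# time-stepping for viscous Burgers' on a periodic domain, using
# central differences for u_x and u_xx. Each update applies a fixed,
# sparse stencil -- the structured operator the abstract contrasts
# with a PINN's dense learned parameters.

nx, visc = 128, 0.05
x = np.linspace(0.0, 2.0 * np.pi, nx, endpoint=False)
dx = x[1] - x[0]
dt = 0.2 * dx**2 / visc            # conservative explicit step
u = np.sin(x)                      # smooth initial condition

for _ in range(2000):
    ux = (np.roll(u, -1) - np.roll(u, 1)) / (2.0 * dx)          # u_x
    uxx = (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / dx**2    # u_xx
    u = u + dt * (-u * ux + visc * uxx)

print(float(np.max(np.abs(u))))    # viscosity has damped the sine wave
```

A PINN would instead parameterize u(x, t) with a network and minimize the PDE residual at sampled points; per the abstract, both pathways approximate the same underlying dynamics.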

Related Articles

Machine Learning

[R] Are there ML approaches for prioritizing and routing “important” signals across complex systems?

I’ve been reading more about attention mechanisms in transformers and how they effectively learn to weight and prioritize relevant inputs...

Reddit - Machine Learning · 1 min ·
LLMs

[P] I trained a language model from scratch for a low resource language and got it running fully on-device on Android (no GPU, demo)

Hi Everybody! I just wanted to share an update on a project I’ve been working on called BULaMU, a family of language models trained (20M,...

Reddit - Machine Learning · 1 min ·
Machine Learning

[R] Structure Over Scale: Memory-First Reasoning and Depth-Pruned Efficiency in Magnus and Seed Architecture Auto-Discovery

Dataset Model Acc F1 Δ vs Log Δ vs Static Avg Params Peak Params Steps Infer ms Size Banking77-20 Logistic TF-IDF 92.37% 0.9230 +0.00pp +...

Reddit - Machine Learning · 1 min ·
Machine Learning

UM Computer Scientists Land Grant to Improve Models of Melting Greenland Glaciers

Two UM researchers are using advanced neural networks, machine learning and artificial intelligence to improve climate models to better p...

AI News - General · 5 min ·
