[2509.18131] Randomness and signal propagation in physics-informed neural networks (PINNs): A neural PDE perspective
Summary
This article investigates the apparent randomness of trained weight matrices in physics-informed neural networks (PINNs) and its impact on signal propagation and stability, offering insight into how network architecture relates to numerical stability.
Why It Matters
Understanding how randomness in PINN weights affects signal propagation is key to improving the stability and interpretability of these networks. The work links theoretical tools from random matrix theory to the practical analysis of neural PDEs.
Key Takeaways
- PINNs exhibit weight matrices that behave randomly post-training, affecting signal propagation.
- The study connects random matrix theory with the dynamics of neural PDEs.
- Numerical stability in network architecture is critical for effective signal evolution.
- Weight structures corresponding to unstable explicit discretization schemes degrade signal propagation.
- Weight structures corresponding to stable implicit and higher-order schemes support well-behaved signal dynamics.
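The stability contrast in the takeaways above can be illustrated with a toy example (not from the paper): forward (explicit) versus backward (implicit) Euler time-stepping for the 1D heat equation, with a time step deliberately chosen to violate the explicit stability limit. The grid sizes, step count, and noise seed below are illustrative choices, not values from the study.

```python
# Toy illustration (not the authors' setup): scheme choice governs stability.
# Solve u_t = nu * u_xx on [0, 1] with nu*dt/dx^2 = 0.6, which violates the
# explicit stability limit nu*dt/dx^2 <= 0.5; backward Euler, by contrast,
# is unconditionally stable and damps every mode.
import numpy as np

nu, nx, nt = 0.1, 64, 200
dx = 1.0 / (nx - 1)
dt = 0.6 * dx**2 / nu                      # exceeds the explicit limit
x = np.linspace(0.0, 1.0, nx)

rng = np.random.default_rng(0)
u0 = np.sin(np.pi * x) + 1e-3 * rng.standard_normal(nx)  # noise seeds high modes
u0[0] = u0[-1] = 0.0                       # homogeneous Dirichlet boundaries

# Second-difference operator; boundary rows zeroed to hold the ends fixed.
L = (np.diag(-2.0 * np.ones(nx)) + np.diag(np.ones(nx - 1), 1)
     + np.diag(np.ones(nx - 1), -1)) / dx**2
L[0, :] = L[-1, :] = 0.0

r = nu * dt
u_exp, u_imp = u0.copy(), u0.copy()
A = np.eye(nx) - r * L                     # backward-Euler system matrix
for _ in range(nt):
    u_exp = u_exp + r * (L @ u_exp)        # forward Euler: high modes amplified
    u_imp = np.linalg.solve(A, u_imp)      # backward Euler: all modes damped

print("explicit max |u|:", np.max(np.abs(u_exp)))  # grows without bound
print("implicit max |u|:", np.max(np.abs(u_imp)))  # stays bounded
```

The same mechanism is what the paper invokes for signal features inside a network: when the layer-to-layer update mimics an unstable explicit scheme, small perturbations are amplified geometrically with depth.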
Computer Science > Machine Learning
arXiv:2509.18131 (cs)
[Submitted on 12 Sep 2025 (v1), last revised 16 Feb 2026 (this version, v2)]
Authors: Jean-Michel Tucny, Abhisek Ganguly, Santosh Ansumali, Sauro Succi
Abstract: Physics-informed neural networks (PINNs) often exhibit weight matrices that appear statistically random after training, yet their implications for signal propagation, stability, and interpretability remain poorly understood. In this work, we analyze the spectral and statistical properties of trained PINN weights using viscous and inviscid variants of the one-dimensional Burgers' equation, and show that the learned weights reside in a high-entropy regime consistent with predictions from random matrix theory. To investigate the dynamical consequences of such weight structures, we study the evolution of signal features inside a network through the lens of neural partial differential equations (neural PDEs). We show that random and structured weight matrices can be associated with specific discretizations of neural PDEs, and that the numerical stability of these discretizations governs the stability of signal propagation through t...
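The random-matrix comparison the abstract alludes to can be sketched as follows (this is not the authors' code, and the matrix size and variance scaling are illustrative assumptions): for an n x n matrix with i.i.d. entries of variance 1/n, the eigenvalues of W W^T concentrate in the Marchenko-Pastur support, here [0, 4]. A trained weight matrix whose spectrum matches this bulk is, in that sense, statistically indistinguishable from a random one.

```python
# Sketch (not the paper's code) of a random-matrix spectral diagnostic.
# For W with i.i.d. N(0, 1/n) entries, the eigenvalues of W @ W.T follow
# the Marchenko-Pastur law with aspect ratio 1, supported on [0, 4].
import numpy as np

rng = np.random.default_rng(0)
n = 512
W = rng.standard_normal((n, n)) / np.sqrt(n)
evals = np.linalg.eigvalsh(W @ W.T)        # symmetric, so eigvalsh applies

mp_edge = 4.0                              # upper edge of the MP support
frac_inside = np.mean(evals <= mp_edge * 1.05)  # 5% tolerance for edge noise
print(f"max eigenvalue: {evals.max():.3f} (MP edge: {mp_edge})")
print(f"fraction within MP support: {frac_inside:.3f}")
```

In practice one would run the same diagnostic on the trained PINN weight matrices and compare the empirical eigenvalue histogram against the Marchenko-Pastur density.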