[2604.02601] WGFINNs: Weak formulation-based GENERIC formalism informed neural networks
Computer Science > Machine Learning
arXiv:2604.02601 (cs)
[Submitted on 3 Apr 2026]
Title: WGFINNs: Weak formulation-based GENERIC formalism informed neural networks
Authors: Jun Sur Richard Park, Auroni Huque Hashim, Siu Wun Cheung, Youngsoo Choi, Yeonjong Shin
Abstract: Data-driven discovery of governing equations from noisy observations remains a fundamental challenge in scientific machine learning. While GENERIC formalism informed neural networks (GFINNs) provide a principled framework that enforces the laws of thermodynamics by construction, their reliance on strong-form loss formulations makes them highly sensitive to measurement noise. To address this limitation, we propose weak formulation-based GENERIC formalism informed neural networks (WGFINNs), which integrate the weak formulation of dynamical systems with the structure-preserving architecture of GFINNs. WGFINNs significantly enhance robustness to noisy data while retaining exact satisfaction of the GENERIC degeneracy and symmetry conditions. We further incorporate a state-wise weighted loss and a residual-based attention mechanism to mitigate scale imbalance across state variables. A theoretical analysis quantifies the differences between the strong-form and weak-form estimators. In particular, the strong-form estimator diverges as the...
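To illustrate the contrast the abstract draws between strong-form and weak-form residuals, here is a minimal sketch (not from the paper; the ODE, test function, and all variable names are illustrative assumptions). For a dynamical system du/dt = f(u), the strong-form residual requires numerically differentiating the (possibly noisy) trajectory data, while the weak form integrates against a smooth test function that vanishes at the endpoints, moving the derivative onto the test function via integration by parts:

```python
import numpy as np

# Illustrative 1-D system: du/dt = f(u) with f(u) = -u, exact solution u(t) = exp(-t).
t = np.linspace(0.0, 1.0, 201)
u = np.exp(-t)
f_u = -u

# Smooth test function vanishing at both endpoints, phi(t) = sin(pi t)^2,
# and its analytic derivative.
phi = np.sin(np.pi * t) ** 2
dphi = 2.0 * np.pi * np.sin(np.pi * t) * np.cos(np.pi * t)

def trap(y, x):
    """Composite trapezoidal rule for the integrals below."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

# Strong-form residual: needs du/dt estimated directly from the data,
# which amplifies measurement noise.
strong_res = np.gradient(u, t) - f_u

# Weak-form residual: integration by parts removes the data derivative,
#   integral(u' phi) = -integral(u phi')   (boundary terms vanish),
# so the residual is  -integral(u phi') - integral(f(u) phi)  ~ 0.
weak_res = -trap(u * dphi, t) - trap(f_u * phi, t)
```

Because the exact solution is used here, both residuals are near zero; the point of the weak form is that adding noise to `u` perturbs `weak_res` only through smooth integrals, whereas `strong_res` degrades with the noisy finite-difference derivative.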