[2504.04665] A Simultaneous Approach for Training Neural Differential-Algebraic Systems of Equations
Computer Science > Machine Learning

arXiv:2504.04665 (cs)

[Submitted on 7 Apr 2025 (v1), last revised 2 Apr 2026 (this version, v2)]

Title: A Simultaneous Approach for Training Neural Differential-Algebraic Systems of Equations

Authors: Laurens R. Lueg, Victor Alves, Daniel Schicksnus, John R. Kitchin, Carl D. Laird, Lorenz T. Biegler

Abstract: Scientific machine learning is an emerging field that broadly describes the combination of scientific computing and machine learning to address challenges in science and engineering. Within the context of differential equations, this has produced highly influential methods, such as neural ordinary differential equations (NODEs). Recent works extend this line of research to consider neural differential-algebraic systems of equations (DAEs), where some unknown relationships within the DAE are learned from data. Training neural DAEs, similarly to neural ODEs, is computationally expensive, as it requires the solution of a DAE for every parameter update. Further, the rigorous consideration of algebraic constraints is difficult within common deep learning training algorithms such as stochastic gradient descent. In this work, we apply the simultaneous approach to neural DAE problems, resulting in a fully discretized nonlinear optimization problem, which is s...
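To make the simultaneous (full-discretization) idea concrete, the sketch below trains a hypothetical toy index-1 DAE by discretizing it with implicit Euler and handing the states, the algebraic variables, and the unknown model parameter to a single nonlinear program as decision variables, with the discretized equations as equality constraints. Everything here is an illustrative assumption, not the paper's formulation: a single scalar parameter stands in for a neural network, implicit Euler stands in for the discretization scheme, and SciPy's SLSQP stands in for the NLP solver.

```python
import numpy as np
from scipy.optimize import minimize

# Toy index-1 DAE (hypothetical, for illustration only):
#   dx/dt = -a*x + z        (differential equation, parameter a is "learned")
#   0     = z - 0.5*x       (algebraic equation)
N, h = 20, 0.05              # number of implicit-Euler steps and step size
a_true, x0 = 2.0, 1.0        # ground-truth parameter and initial condition

# Generate synthetic measurements of x with the same implicit-Euler scheme;
# substituting z = 0.5*x gives the closed-form step below.
x_data = np.empty(N + 1)
x_data[0] = x0
for k in range(N):
    x_data[k + 1] = x_data[k] / (1.0 + h * (a_true - 0.5))

def unpack(v):
    """Split the flat decision vector into (a, states x, algebraic vars z)."""
    a = v[0]
    x = np.concatenate(([x0], v[1:N + 1]))  # x0 is fixed, not a variable
    z = v[N + 1:]
    return a, x, z

def objective(v):
    """Data-misfit objective over the discretized state trajectory."""
    _, x, _ = unpack(v)
    return np.sum((x - x_data) ** 2)

def constraints(v):
    """All discretized DAE equations, enforced as NLP equality constraints."""
    a, x, z = unpack(v)
    coll = x[1:] - x[:-1] - h * (-a * x[1:] + z[1:])  # implicit Euler steps
    alg = z - 0.5 * x                                 # algebraic equations
    return np.concatenate((coll, alg))

# Initial guess: a = 1.0, states at the data, z consistent with the algebra.
v0 = np.concatenate(([1.0], x_data[1:], 0.5 * x_data))
res = minimize(objective, v0, method="SLSQP",
               constraints={"type": "eq", "fun": constraints})
a_hat = res.x[0]
print(f"recovered a = {a_hat:.4f} (true value {a_true})")
```

Note that no DAE solve occurs inside the optimization loop: the solver satisfies the discretized equations and fits the data simultaneously, which is the contrast with sequential (solve-then-differentiate) neural ODE/DAE training that the abstract describes.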