[2604.05374] LMI-Net: Linear Matrix Inequality--Constrained Neural Networks via Differentiable Projection Layers
Computer Science > Machine Learning
arXiv:2604.05374 (cs)
[Submitted on 7 Apr 2026]

Title: LMI-Net: Linear Matrix Inequality--Constrained Neural Networks via Differentiable Projection Layers
Authors: Sunbochen Tang, Andrea Goertzen, Navid Azizan

Abstract: Linear matrix inequalities (LMIs) have played a central role in certifying stability, robustness, and forward invariance of dynamical systems. Despite rapid development in learning-based methods for control design and certificate synthesis, existing approaches often fail to preserve the hard matrix inequality constraints required for formal guarantees. We propose LMI-Net, an efficient and modular differentiable projection layer that enforces LMI constraints by construction. Our approach lifts the set defined by LMI constraints into the intersection of an affine equality constraint and the positive semidefinite cone, performs the forward pass via Douglas-Rachford splitting, and supports efficient backward propagation through implicit differentiation. We establish theoretical guarantees that the projection layer converges to a feasible point, certifying that LMI-Net transforms a generic neural network into a reliable model satisfying LMI constraints. Evaluated on experiments including invariant ellipsoid synthesis and joint control...
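The forward pass described in the abstract (lifting the LMI set into the intersection of an affine constraint and the PSD cone, then alternating projections via Douglas-Rachford splitting) can be sketched roughly as follows. This is an illustrative reconstruction under assumed conventions, not the authors' implementation: the function names are hypothetical, the constraint is encoded as A vec(X) = b for a row-major vectorization, and the implicit-differentiation backward pass is omitted.

```python
import numpy as np

def proj_psd(X):
    """Project a symmetric matrix onto the PSD cone by clipping negative eigenvalues."""
    X = (X + X.T) / 2  # symmetrize before the eigendecomposition
    w, V = np.linalg.eigh(X)
    return V @ np.diag(np.clip(w, 0.0, None)) @ V.T

def proj_affine(x, A, b, A_pinv):
    """Project a vector onto the affine set {x : A x = b} (least-squares correction)."""
    return x - A_pinv @ (A @ x - b)

def douglas_rachford_lmi(A, b, n, iters=500):
    """Find a symmetric n-by-n PSD matrix X with A vec(X) = b via
    Douglas-Rachford splitting on the two projections above."""
    A_pinv = A.T @ np.linalg.pinv(A @ A.T)  # precompute for the affine projection
    z = np.zeros(n * n)
    for _ in range(iters):
        x = proj_psd(z.reshape(n, n)).reshape(-1)  # project onto the PSD cone
        y = proj_affine(2 * x - z, A, b, A_pinv)   # project the reflection onto the affine set
        z = z + y - x                              # Douglas-Rachford update
    return proj_psd(z.reshape(n, n))               # PSD iterate, feasible at convergence

# Toy example: find a 2x2 PSD matrix with trace(X) = 1 and X[0, 1] = 0.
A = np.array([[1.0, 0.0, 0.0, 1.0],   # trace constraint on vec(X)
              [0.0, 1.0, 0.0, 0.0]])  # off-diagonal entry constraint
b = np.array([1.0, 0.0])
X = douglas_rachford_lmi(A, b, n=2)
```

At a fixed point of the update, the PSD projection of the iterate also satisfies the affine constraint, so the returned matrix lies in the original LMI set; the paper's contribution of differentiating through this map is not shown here.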