[2603.24500] Project and Generate: Divergence-Free Neural Operators for Incompressible Flows
Computer Science > Machine Learning
arXiv:2603.24500 (cs)
[Submitted on 25 Mar 2026]

Title: Project and Generate: Divergence-Free Neural Operators for Incompressible Flows
Authors: Xigui Li, Hongwei Zhang, Ruoxi Jiang, Deshu Chen, Chensen Lin, Limei Han, Yuan Qi, Xin Guo, Yuan Cheng

Abstract: Learning-based models for fluid dynamics often operate in unconstrained function spaces, leading to physically inadmissible, unstable simulations. While penalty-based methods offer soft regularization, they provide no structural guarantees, resulting in spurious divergence and long-term collapse. In this work, we introduce a unified framework that enforces the incompressible continuity equation as a hard, intrinsic constraint for both deterministic and generative modeling. First, to project deterministic models onto the divergence-free subspace, we integrate a differentiable spectral Leray projection grounded in the Helmholtz-Hodge decomposition, which restricts the regression hypothesis space to physically admissible velocity fields. Second, to generate physically consistent distributions, we show that simply projecting model outputs is insufficient when the prior is incompatible. To address this, we construct a divergence-free Gaussian reference measure via a curl-based pushforward, ensuring the entire probability f...
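To make the two constructions named in the abstract concrete, here is a minimal NumPy sketch on a 2D periodic grid: a spectral Leray projection (the Helmholtz-Hodge-based projection onto divergence-free fields) and a curl-based pushforward that turns a Gaussian streamfunction into a divergence-free Gaussian sample. The function names, grid conventions, and normalizations are our own assumptions for illustration and are not taken from the paper's implementation.

```python
import numpy as np

def leray_project(u, v):
    """Spectral Leray projection of a 2D periodic velocity field (u, v)
    onto its divergence-free part, via the Helmholtz-Hodge decomposition:
    u_hat <- u_hat - k (k . u_hat) / |k|^2 for each Fourier mode k."""
    n, m = u.shape
    kx = np.fft.fftfreq(n) * n                     # integer wavenumbers, axis 0
    ky = np.fft.fftfreq(m) * m                     # integer wavenumbers, axis 1
    KX, KY = np.meshgrid(kx, ky, indexing="ij")
    k2 = KX**2 + KY**2
    k2[0, 0] = 1.0                                 # avoid 0/0 at the mean mode

    u_hat, v_hat = np.fft.fft2(u), np.fft.fft2(v)
    kdotu = KX * u_hat + KY * v_hat                # k . u_hat (the i factor cancels)
    u_hat -= KX * kdotu / k2                       # remove the curl-free (gradient) part
    v_hat -= KY * kdotu / k2
    return np.real(np.fft.ifft2(u_hat)), np.real(np.fft.ifft2(v_hat))


def sample_divfree_gaussian(n, m, seed=None):
    """Divergence-free Gaussian sample via a curl-based pushforward:
    draw a Gaussian streamfunction psi and set (u, v) = (d psi/dy, -d psi/dx),
    which is divergence-free by construction."""
    rng = np.random.default_rng(seed)
    psi = rng.standard_normal((n, m))
    kx = np.fft.fftfreq(n) * n
    ky = np.fft.fftfreq(m) * m
    KX, KY = np.meshgrid(kx, ky, indexing="ij")
    psi_hat = np.fft.fft2(psi)
    u = np.real(np.fft.ifft2(1j * KY * psi_hat))   #  d psi / dy
    v = np.real(np.fft.ifft2(-1j * KX * psi_hat))  # -d psi / dx
    return u, v
```

In this hypothetical setup, `leray_project` would wrap a deterministic model's raw velocity output, while `sample_divfree_gaussian` plays the role of the incompressible reference measure for the generative branch; both operations are compositions of FFTs and pointwise multiplications, so they remain differentiable when re-expressed in an autodiff framework.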