[2602.17773] Learning Flow Distributions via Projection-Constrained Diffusion on Manifolds

Summary

The paper presents a novel generative modeling framework for synthesizing physically feasible two-dimensional incompressible flows, addressing limitations of existing diffusion-based models.

Why It Matters

This research integrates hard physical constraints into generative models, broadening their applicability in fields like robotics and scientific computing. By guaranteeing that generated flows satisfy incompressibility, it opens new avenues for realistic simulation in fluid dynamics.

Key Takeaways

  • Introduces a boundary-conditioned diffusion model for velocity fields.
  • Incorporates a physics-informed training objective with a divergence penalty.
  • Utilizes a projection-constrained reverse diffusion process to ensure incompressibility.
  • Demonstrates improved accuracy in flow generation compared to existing methods.
  • Provides a foundation for further research in generative modeling of incompressible fields.

Physics > Fluid Dynamics
arXiv:2602.17773 (physics) [Submitted on 19 Feb 2026]

Title: Learning Flow Distributions via Projection-Constrained Diffusion on Manifolds
Authors: Noah Trupin, Rahul Ghosh, Aadi Jangid

Abstract: We present a generative modeling framework for synthesizing physically feasible two-dimensional incompressible flows under arbitrary obstacle geometries and boundary conditions. Whereas existing diffusion-based flow generators either ignore physical constraints, impose soft penalties that do not guarantee feasibility, or specialize to fixed geometries, our approach integrates three complementary components: (1) a boundary-conditioned diffusion model operating on velocity fields; (2) a physics-informed training objective incorporating a divergence penalty; and (3) a projection-constrained reverse diffusion process that enforces exact incompressibility through a geometry-aware Helmholtz-Hodge operator. We derive the method as a discrete approximation to constrained Langevin sampling on the manifold of divergence-free vector fields, providing a connection between modern diffusion models and geometric constraint enforcement in incompressible flow spaces. Experiments on analytic Navier-Stokes data and obstacle-bounded flow configurations demonstrate significantly improved divergence, spectral ...
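The abstract's second component, a training objective with a divergence penalty, can be sketched as a denoising-style MSE plus a soft incompressibility term. This is an illustration under assumed conventions: the field layout `(2, H, W)`, the finite-difference discretization, and the weight `lam` are all hypothetical, and the paper's exact objective may differ.

```python
import numpy as np

def divergence_fd(u, v, h=1.0):
    """Central-difference divergence of a 2D velocity field on a periodic grid."""
    du_dx = (np.roll(u, -1, axis=0) - np.roll(u, 1, axis=0)) / (2 * h)
    dv_dy = (np.roll(v, -1, axis=1) - np.roll(v, 1, axis=1)) / (2 * h)
    return du_dx + dv_dy

def physics_informed_loss(pred, target, lam=0.1):
    """Denoising-style MSE plus a soft divergence penalty on the prediction.

    `pred` and `target` have shape (2, H, W) with channels (u, v); `lam` is a
    hypothetical penalty weight -- the paper's exact objective may differ."""
    mse = np.mean((pred - target) ** 2)
    div = divergence_fd(pred[0], pred[1])
    return mse + lam * np.mean(div**2)

# A constant field is divergence-free, so the penalty term vanishes
# and the loss reduces to the plain MSE.
pred = np.ones((2, 8, 8))
target = np.zeros((2, 8, 8))
loss = physics_informed_loss(pred, target, lam=0.5)  # → 1.0
```

A penalty of this form only encourages small divergence during training; it is the projection in the reverse process that makes incompressibility exact at sampling time.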
