[2602.17089] Synergizing Transport-Based Generative Models and Latent Geometry for Stochastic Closure Modeling

arXiv - Machine Learning · 4 min read · Article

Summary

This paper presents an approach to stochastic closure modeling that integrates transport-based generative models with latent-space geometry, improving sampling speed by up to two orders of magnitude while preserving physical fidelity.

Why It Matters

The research addresses a critical limitation of diffusion models: their slow, iterative sampling. By moving to single-step flow matching in a lower-dimensional latent space, the approach makes generative closure models fast enough for practical use in simulations of complex dynamical systems.

Key Takeaways

  • Transport-based generative models can achieve faster sampling for stochastic closure models.
  • Flow matching in lower-dimensional latent spaces enhances efficiency, enabling single-step sampling.
  • Implicit and explicit regularization methods maintain physical fidelity and topological information.
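
The single-step sampling mentioned above can be illustrated with a toy flow-matching loop: regress a velocity field onto straight-line paths between prior and data samples, then take one Euler step from the prior. This is a minimal numpy sketch under stated assumptions; the linear velocity model, the 2D latent dimension, and all variable names are illustrative stand-ins, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy latent "data" distribution: a shifted Gaussian in a 2D latent space.
# (Illustrative only; the paper works with encoded closure-term fields.)
d = 2
z_data = rng.normal(loc=3.0, scale=0.5, size=(4096, d))

# Linear velocity model v(z) = z @ W + b, a stand-in for a neural net
# (a real flow-matching model would also condition on the time t).
W = np.zeros((d, d))
b = np.zeros(d)

lr, batch = 0.05, 256
for _ in range(2000):
    z0 = rng.normal(size=(batch, d))                  # prior samples
    z1 = z_data[rng.integers(0, len(z_data), batch)]  # data samples
    t = rng.uniform(size=(batch, 1))
    zt = (1 - t) * z0 + t * z1   # point on the straight-line path
    target = z1 - z0             # flow-matching regression target
    err = (zt @ W + b) - target
    W -= lr * zt.T @ err / batch  # gradient step on 0.5 * mean ||err||^2
    b -= lr * err.mean(axis=0)

# Single-step sampling: one Euler step of unit length from the prior,
# instead of the many iterative denoising steps a diffusion model needs.
z0 = rng.normal(size=(1000, d))
samples = z0 + (z0 @ W + b)
print(samples.mean())  # moves from the prior mean of 0 toward the data mean of 3
```

The speedup in the paper comes from exactly this structure: sampling costs a single network evaluation rather than an iterative integration of a reverse diffusion process.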

Computer Science > Machine Learning
arXiv:2602.17089 (cs) [Submitted on 19 Feb 2026]

Title: Synergizing Transport-Based Generative Models and Latent Geometry for Stochastic Closure Modeling
Authors: Xinghao Dong, Huchen Yang, Jin-long Wu

Abstract: Diffusion models recently developed for generative AI tasks can produce high-quality samples while still maintaining diversity among samples to promote mode coverage, providing a promising path for learning stochastic closure models. Compared to other types of generative AI models, such as GANs and VAEs, the sampling speed is known as a key disadvantage of diffusion models. By systematically comparing transport-based generative models on a numerical example of 2D Kolmogorov flows, we show that flow matching in a lower-dimensional latent space is suited for fast sampling of stochastic closure models, enabling single-step sampling that is up to two orders of magnitude faster than iterative diffusion-based approaches. To control the latent space distortion and thus ensure the physical fidelity of the sampled closure term, we compare the implicit regularization offered by a joint training scheme against two explicit regularizers: metric-preserving (MP) and geometry-aware (GA) constraints. Besides offering a faster sampling speed, both explicitly...
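
One way to read the metric-preserving (MP) idea in the abstract is as a regularizer that penalizes distortion of pairwise distances when mapping into the latent space. The sketch below is an assumption about the general form, not the paper's actual constraint; the function name and the quadratic penalty are illustrative.

```python
import numpy as np

def metric_preserving_penalty(x, z):
    """Mean squared mismatch between pairwise distances in physical space
    (x) and in latent space (z). Zero iff the encoding is an isometry on
    this batch; illustrative of the MP idea, not the paper's exact form."""
    dx = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    dz = np.linalg.norm(z[:, None, :] - z[None, :, :], axis=-1)
    return float(np.mean((dx - dz) ** 2))

x = np.random.default_rng(1).normal(size=(32, 8))
print(metric_preserving_penalty(x, x))        # identity encoder: 0.0
print(metric_preserving_penalty(x, 2.0 * x))  # distance-distorting encoder: > 0
```

A term like this, added to the autoencoder loss, discourages the encoder from stretching or collapsing regions of the latent space, which is one way to keep the geometry of sampled closure terms physically faithful.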


