[2602.19600] Manifold-Aligned Generative Transport
Summary
The paper presents Manifold-Aligned Generative Transport (MAGT), a generative model that samples high-dimensional data in a single forward pass by aligning the generator with the low-dimensional manifold on which the data concentrate, improving both fidelity and sampling speed over existing methods.
Why It Matters
This research addresses key challenges in generative modeling, particularly the balance between fidelity and sampling efficiency. By proposing MAGT, the authors provide a promising alternative to diffusion models and normalizing flows, potentially advancing applications in machine learning and data generation.
Key Takeaways
- MAGT learns a one-shot, manifold-aligned transport from a low-dimensional base distribution.
- The model improves sampling efficiency while maintaining high fidelity to the data manifold.
- Finite-sample Wasserstein bounds are established, linking smoothing level and score-approximation accuracy to generative fidelity.
- MAGT samples significantly faster than traditional diffusion models.
- Empirical results demonstrate improved fidelity and concentration near the learned support across various datasets.
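The "one-shot" takeaway can be made concrete with a minimal sketch of the sampling interface: draw from a low-dimensional base distribution, then apply a single forward pass into the ambient space. The shapes, the random linear map, and the `tanh` nonlinearity below are purely illustrative stand-ins, not the paper's trained transport.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: d-dimensional base, D-dimensional ambient space (d << D).
d, D = 8, 784

# Stand-in for a trained MAGT generator: a fixed random linear map plus a
# nonlinearity, used only to illustrate the one-shot sampling interface.
W = rng.standard_normal((D, d)) / np.sqrt(d)

def generate(n):
    """One-shot sampling: draw low-dimensional z, apply a single forward pass."""
    z = rng.standard_normal((n, d))   # low-dimensional base distribution
    return np.tanh(z @ W.T)           # single forward pass into ambient space

samples = generate(16)
print(samples.shape)  # (16, 784)
```

The contrast with diffusion models is the absence of any iterative loop: each sample costs one generator evaluation.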
arXiv:2602.19600 (stat) [Submitted on 23 Feb 2026]
Title: Manifold-Aligned Generative Transport
Authors: Xinyu Tian, Xiaotong Shen
Abstract: High-dimensional generative modeling is fundamentally a manifold-learning problem: real data concentrate near a low-dimensional structure embedded in the ambient space. Effective generators must therefore balance support fidelity -- placing probability mass near the data manifold -- with sampling efficiency. Diffusion models often capture near-manifold structure but require many iterative denoising steps and can leak off-support; normalizing flows sample in one pass but are limited by invertibility and dimension preservation. We propose MAGT (Manifold-Aligned Generative Transport), a flow-like generator that learns a one-shot, manifold-aligned transport from a low-dimensional base distribution to the data space. Training is performed at a fixed Gaussian smoothing level, where the score is well-defined and numerically stable. We approximate this fixed-level score using a finite set of latent anchor points with self-normalized importance sampling, yielding a tractable objective. MAGT samples in a single forward pass, concentrates probability near the learned support, and induces an intrinsic density with respect to the manifold volume measure, enabling principled likelihood evaluation…
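The abstract's fixed-level score construction can be illustrated under a simplifying assumption: if the Gaussian-smoothed density is approximated by a mixture over anchor points, p_sigma(x) ≈ (1/N) Σ_i N(x; a_i, sigma² I), then the score at x is a self-normalized weighted average of the directions toward the anchors. This is a sketch of that formula, not the paper's full training objective (which draws the anchors via importance sampling over latents).

```python
import numpy as np

def smoothed_score(x, anchors, sigma):
    """Score of a Gaussian-smoothed anchor mixture p_sigma(x) = mean_i N(x; a_i, sigma^2 I).

    The mixture responsibilities are computed with self-normalized weights,
    so only relative (unnormalized) log-densities are needed.
    """
    diffs = anchors - x                                 # (N, D) directions toward anchors
    logw = -np.sum(diffs**2, axis=1) / (2 * sigma**2)   # unnormalized log-weights
    logw -= logw.max()                                  # stabilize before exponentiating
    w = np.exp(logw)
    w /= w.sum()                                        # self-normalization
    return (w[:, None] * diffs).sum(axis=0) / sigma**2  # grad log p_sigma(x)

rng = np.random.default_rng(1)
anchors = rng.standard_normal((64, 3))
score = smoothed_score(np.zeros(3), anchors, sigma=0.5)
print(score.shape)  # (3,)
```

With a single anchor the formula reduces to the exact Gaussian score (a - x) / sigma², which is a useful sanity check; the fixed smoothing level sigma keeps the score well-defined and numerically stable, as the abstract notes.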