[2511.22693] Generative Anchored Fields: Controlled Data Generation via Emergent Velocity Fields and Transport Algebra
Summary
The paper introduces Generative Anchored Fields (GAF), a generative model that learns independent noise and data endpoint predictors whose disagreement yields an emergent velocity field, enabling controlled interpolation and compositional editing through a transport algebra over those predictors.
Why It Matters
This research advances generative modeling by making data generation more controllable while preserving sample quality. The introduction of Transport Algebra and the ability to map a shared noise distribution to multiple data domains could have broad implications for machine learning applications that require fine-grained control over generated data, such as computer vision and natural language processing.
Key Takeaways
- Generative Anchored Fields (GAF) enables controlled data generation from independent endpoint predictors.
- The model utilizes emergent velocity fields to enhance compositional control and interpolation.
- Achieves strong sample quality metrics (FID) on benchmark datasets like ImageNet and CelebA-HQ.
- Introduces Iterative Endpoint Refinement (IER) for high-quality generation in fewer steps.
- Facilitates semantic editing and multi-class composition in generative tasks.
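The multi-class composition mentioned above can be illustrated with a toy construction. The sketch below is our own illustration, not the paper's implementation: it assumes class-specific data-endpoint heads `K_a` and `K_b` that share one noise head `J`, and composes them by convex mixing so the resulting velocity field is `v = (a*K_a + (1-a)*K_b) - J`. All function names here are hypothetical.

```python
import numpy as np

def compose_velocity(J, K_heads, weights):
    """Mix several class-specific K heads against a shared noise head J.

    Returns a composed velocity field v(x, t) = sum_i w_i * K_i(x, t) - J(x, t),
    a toy instance of algebraic operations on endpoint predictors.
    """
    def v(x, t):
        K_mix = sum(w * K(x, t) for w, K in zip(weights, K_heads))
        return K_mix - J(x, t)
    return v

# Constant stand-in heads: noise endpoint at 0, class anchors at 2 and 4.
def J0(x, t): return np.zeros_like(x)
def Ka(x, t): return np.full_like(x, 2.0)
def Kb(x, t): return np.full_like(x, 4.0)

v = compose_velocity(J0, [Ka, Kb], [0.5, 0.5])

# Integrate dx/dt = v(x, t) from t=0 to t=1 with plain Euler steps.
x, dt = np.zeros(3), 0.2
for i in range(5):
    x = x + dt * v(x, i * dt)
print(x)  # → [3. 3. 3.], halfway between the two class anchors
```

With constant heads the composed velocity is constant, so the sample lands exactly at the weighted mixture of the class anchors; with learned, time-conditioned heads the trajectory would instead bend through data space.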
Computer Science > Machine Learning
arXiv:2511.22693 (cs) [Submitted on 27 Nov 2025 (v1), last revised 16 Feb 2026 (this version, v2)]
Authors: Deressa Wodajo Deressa, Hannes Mareen, Peter Lambert, Glenn Van Wallendael
Abstract: We present Generative Anchored Fields (GAF), a generative model that learns independent endpoint predictors, $J$ (noise) and $K$ (data), from any point on a linear bridge. Unlike existing approaches that use a single trajectory or score predictor, GAF is trained to recover the bridge endpoints directly via coordinate learning. The velocity field $v = K - J$ emerges from their time-conditioned disagreement. This factorization enables Transport Algebra: algebraic operations on multiple $J$/$K$ heads for compositional control. With class-specific $K_n$ heads, GAF defines directed transport maps between a shared base noise distribution and multiple data domains, allowing controllable interpolation, multi-class composition, and semantic editing. This is achieved either directly on the predicted data coordinates ($K$) using Iterative Endpoint Refinement (IER), a novel sampler that achieves high-quality generation in 5–8 steps, or on...
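The core mechanism in the abstract, an emergent velocity field $v = K - J$ integrated along a linear bridge, can be sketched in a few lines. This is a minimal toy, not the authors' code: the predictors `toy_J` and `toy_K` are hypothetical constant stand-ins, and the sampler is a plain Euler integrator rather than the paper's Iterative Endpoint Refinement.

```python
import numpy as np

# On a linear bridge x_t = (1 - t) * x0 + t * x1, we assume J(x, t) estimates
# the noise endpoint x0 and K(x, t) estimates the data endpoint x1. Their
# disagreement defines the velocity field v = K - J.

def toy_J(x, t):
    return np.zeros_like(x)   # stand-in: pretend the noise endpoint is 0

def toy_K(x, t):
    return np.ones_like(x)    # stand-in: pretend the data endpoint is 1

def gaf_euler_sample(J, K, x0, n_steps=8):
    """Integrate dx/dt = v(x, t) = K(x, t) - J(x, t) from t=0 to t=1."""
    x, dt = x0.copy(), 1.0 / n_steps
    for i in range(n_steps):
        t = i * dt
        v = K(x, t) - J(x, t)  # emergent velocity field
        x = x + dt * v         # Euler step toward the data endpoint
    return x

sample = gaf_euler_sample(toy_J, toy_K, np.zeros(4))
print(sample)  # → [1. 1. 1. 1.] for this linear toy
```

Because the toy velocity is constant, eight Euler steps transport the start point exactly onto the data endpoint; with learned time-conditioned predictors, fewer steps of a refinement scheme like IER would be needed to reach comparable quality.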