[2602.21429] Provably Safe Generative Sampling with Constricting Barrier Functions
Summary
This paper presents a safety filtering framework for generative models that guarantees generated samples satisfy hard constraints while minimally disrupting the model's learned structure.
Why It Matters
As generative models are increasingly applied in safety-critical domains, ensuring that generated outputs adhere to hard safety constraints is crucial. This research provides a formal mechanism that guarantees safe sampling, which matters for robotics and other fields where constraint violations are costly.
Key Takeaways
- Introduces a safety filtering framework for generative models.
- Utilizes Control Barrier Functions to ensure safety during sampling.
- Achieves 100% constraint satisfaction while maintaining semantic fidelity.
- Applies to various domains, including image generation and robotic manipulation.
- No retraining or architectural changes are needed for existing models.
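The CBF bullet above can be made concrete with a hedged sketch of the per-step optimization, in generic notation that may differ from the paper's (here v_θ denotes the pretrained model's velocity field, h_t the barrier defining the constricting tube, α a gain, and u the filtering correction):

```latex
% Per-step safety filter: minimally perturb the model's drift v_\theta
% so the sample stays inside the tube \{x : h_t(x) \ge 0\}.
\min_{u}\; \|u\|^2
\quad \text{s.t.} \quad
\nabla h_t(x)^{\top}\bigl(v_\theta(x,t) + u\bigr)
+ \frac{\partial h_t}{\partial t}(x)
+ \alpha\, h_t(x) \;\ge\; 0
```

With a single affine constraint in u, this is a small convex QP that is cheap to solve at every sampling step.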
Computer Science > Machine Learning
arXiv:2602.21429 (cs) [Submitted on 24 Feb 2026]
Title: Provably Safe Generative Sampling with Constricting Barrier Functions
Authors: Darshan Gadginmath, Ahmed Allibhoy, Fabio Pasqualetti
Abstract: Flow-based generative models, such as diffusion models and flow matching models, have achieved remarkable success in learning complex data distributions. However, a critical gap remains for their deployment in safety-critical domains: the lack of formal guarantees that generated samples will satisfy hard constraints. We address this by proposing a safety filtering framework that acts as an online shield for any pre-trained generative model. Our key insight is to cooperate with the generative process rather than override it. We define a constricting safety tube that is relaxed at the initial noise distribution and progressively tightens to the target safe set at the final data distribution, mirroring the coarse-to-fine structure of the generative process itself. By characterizing this tube via Control Barrier Functions (CBFs), we synthesize a feedback control input through a convex Quadratic Program (QP) at each sampling step. As the tube is loosest when noise is high and intervention is cheapest in terms of control energy, most constraint enforcement occurs when it least disr...
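The abstract's mechanism can be sketched in a toy form. This is not the paper's implementation: the tube here is an assumed shrinking Euclidean ball, the drift is a made-up constant field standing in for a trained model, and the single-constraint QP is solved in closed form via its KKT conditions rather than with a QP solver.

```python
import numpy as np

def cbf_qp_filter(v, grad_h, h, dh_dt, alpha=5.0):
    """Minimum-energy correction u solving the per-step QP
         min_u ||u||^2  s.t.  grad_h . (v + u) + dh_dt + alpha*h >= 0.
    A single affine constraint admits a closed-form KKT solution."""
    slack = grad_h @ v + dh_dt + alpha * h
    if slack >= 0.0:
        return np.zeros_like(v)          # tube constraint inactive: no intervention
    return (-slack / (grad_h @ grad_h)) * grad_h

def tube_radius(t, r0=5.0, r1=1.0):
    # Constricting tube: relaxed (radius r0) at the noise end t=0,
    # tight (radius r1, the target safe set) at the data end t=1.
    return (1.0 - t) * r0 + t * r1

rng = np.random.default_rng(0)
x = rng.normal(size=2) * 2.0             # stand-in initial noise sample
n_steps = 200
dt = 1.0 / n_steps
dr_dt = 1.0 - 5.0                        # d/dt of tube_radius (constant here)

for k in range(n_steps):
    t = k * dt
    v = np.array([1.0, 0.5])             # toy drift pushing the sample outward
    nx = np.linalg.norm(x) + 1e-9
    h = tube_radius(t) - nx              # barrier: inside the tube iff h >= 0
    grad_h = -x / nx                     # gradient of h w.r.t. x
    u = cbf_qp_filter(v, grad_h, h, dr_dt)
    x = x + dt * (v + u)                 # filtered Euler sampling step

print(f"final |x| = {np.linalg.norm(x):.3f}, target radius = {tube_radius(1.0)}")
```

Because the tube is loose early on, the filter leaves the drift untouched for most of the trajectory and only intervenes near the end, when the radius approaches the target safe set, which mirrors the paper's point that enforcement concentrates where it is cheapest.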