[2506.02371] SFBD Flow: A Continuous-Optimization Framework for Training Diffusion Models with Noisy Samples
Computer Science > Machine Learning

arXiv:2506.02371 (cs) [Submitted on 3 Jun 2025 (v1), last revised 4 Apr 2026 (this version, v2)]

Title: SFBD Flow: A Continuous-Optimization Framework for Training Diffusion Models with Noisy Samples
Authors: Haoye Lu, Darren Lo, Yaoliang Yu

Abstract: Diffusion models achieve strong generative performance but often rely on large datasets that may include sensitive content. This challenge is compounded by the models' tendency to memorize training data, raising privacy concerns. SFBD (Lu et al., 2025) addresses this by training on corrupted data and using limited clean samples to capture local structure and improve convergence. However, its iterative denoising and fine-tuning loop requires manual coordination, making it burdensome to implement. We reinterpret SFBD as an alternating projection algorithm and introduce a continuous variant, SFBD flow, that removes the need for alternating steps. We further show its connection to consistency constraint-based methods, and demonstrate that its practical instantiation, Online SFBD, consistently outperforms strong baselines across benchmarks.

Subjects: Machine Learning (cs.LG)
Cite as: arXiv:2506.02371 [cs.LG] (or arXiv:2506.02371v2 [cs.LG] for this version)
DOI: https://doi.org/10.48550/arXiv.2506.02371
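The abstract reinterprets SFBD's denoise/fine-tune loop as an alternating projection algorithm. As a purely illustrative sketch (not the paper's method), the classical alternating-projection idea can be shown on two lines in the plane: repeatedly projecting onto each set drives the iterate toward their intersection, just as SFBD's two alternating steps are framed as projections in the paper. The sets, projections, and starting point below are hypothetical stand-ins chosen for clarity.

```python
import numpy as np

def project_onto_line(x, direction):
    """Orthogonal projection of point x onto the line through the
    origin spanned by `direction`."""
    d = direction / np.linalg.norm(direction)
    return np.dot(x, d) * d

# Two illustrative sets: the x-axis and the diagonal y = x.
line_a = np.array([1.0, 0.0])
line_b = np.array([1.0, 1.0])

x = np.array([3.0, 4.0])  # arbitrary starting point
for _ in range(50):
    x = project_onto_line(x, line_a)  # analogue of one alternating step
    x = project_onto_line(x, line_b)  # analogue of the other step

# Iterates converge to the intersection of the two lines (the origin).
print(np.round(x, 6))  # → [0. 0.]
```

Each full sweep halves the distance to the intersection here, so 50 sweeps land numerically on the origin; the paper's SFBD flow replaces such discrete alternating sweeps with a single continuous process.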