[2505.12412] Training Latent Diffusion Models with Interacting Particle Algorithms
Statistics > Machine Learning — arXiv:2505.12412 (stat)
[Submitted on 18 May 2025 (v1), last revised 29 Mar 2026 (this version, v3)]

Title: Training Latent Diffusion Models with Interacting Particle Algorithms
Authors: Tim Y. J. Wang, Juan Kuntz, O. Deniz Akyildiz

Abstract: We introduce a novel particle-based algorithm for end-to-end training of latent diffusion models. We reformulate the training task as the minimization of a free energy functional and derive a gradient flow that performs this minimization. By approximating the flow with a system of interacting particles, we obtain the algorithm, which we underpin theoretically with error guarantees. In experiments, the novel algorithm compares favorably with previous particle-based methods and variational inference analogues.

Subjects: Machine Learning (stat.ML); Machine Learning (cs.LG)
Cite as: arXiv:2505.12412 [stat.ML] (or arXiv:2505.12412v3 [stat.ML] for this version)
DOI: https://doi.org/10.48550/arXiv.2505.12412

Submission history
From: Tim Y. J. Wang
[v1] Sun, 18 May 2025 13:29:07 UTC (7,102 KB)
[v2] Fri, 23 May 2025 18:19:28 UTC (7,092 KB)
[v3] Sun, 29 Mar 2026 12:10:08 UTC (6,840 KB)
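To give a feel for the class of method the abstract describes, here is a minimal sketch of an interacting-particle scheme for maximum-likelihood training of a latent-variable model, viewed as a time discretization of a gradient flow on a free-energy functional. This is an illustrative assumption, not the paper's actual algorithm: the toy Gaussian model, step sizes, and update rules are all chosen for clarity only.

```python
import numpy as np

# Hedged sketch (NOT the paper's algorithm): a generic interacting-particle
# discretization of a free-energy gradient flow for a toy latent-variable model.
#
# Toy model (an assumption for illustration):
#   z ~ N(0, 1),  x | z ~ N(theta + z, 1),  with a single observation x.
# The marginal is x | theta ~ N(theta, 2), so the MLE is theta = x.

rng = np.random.default_rng(0)
x = 2.0  # single observation

def grad_log_joint_z(theta, z):
    # d/dz log p(x, z | theta) = (x - theta - z) - z
    return (x - theta - z) - z

def grad_log_joint_theta(theta, z):
    # d/dtheta log p(x, z | theta) = x - theta - z
    return x - theta - z

N = 100                      # number of particles approximating p(z | x, theta)
z = rng.standard_normal(N)   # particle cloud over the latent variable
theta = 0.0                  # model parameter, initialized away from the MLE
h = 0.05                     # step size

for _ in range(2000):
    # Parameter step: ascend the particle-averaged gradient of log p(x, z | theta)
    # (a descent step on the free energy in the theta direction).
    theta += h * np.mean(grad_log_joint_theta(theta, z))
    # Particle step: unadjusted Langevin move targeting the posterior p(z | x, theta)
    # (a descent step on the free energy in the distribution direction).
    z += h * grad_log_joint_z(theta, z) + np.sqrt(2.0 * h) * rng.standard_normal(N)

# After convergence, theta should sit near the MLE (theta = x = 2 here),
# and the particles near the posterior p(z | x, theta).
```

The two updates are the point of the construction: the same free-energy functional drives both the parameter and the particle cloud, so a single coupled system performs end-to-end training rather than alternating a separate inference and learning phase.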