[2603.02337] Preconditioned Score and Flow Matching
Computer Science > Machine Learning

arXiv:2603.02337 (cs)

[Submitted on 2 Mar 2026]

Title: Preconditioned Score and Flow Matching

Authors: Shadab Ahamed, Eshed Gal, Simon Ghyselincks, Md Shahriar Rahim Siddiqui, Moshe Eliasof, Eldad Haber

Abstract: Flow matching and score-based diffusion train vector fields under intermediate distributions $p_t$, whose geometry can strongly affect their optimization. We show that the covariance $\Sigma_t$ of $p_t$ governs optimization bias: when $\Sigma_t$ is ill-conditioned, gradient-based training rapidly fits high-variance directions while systematically under-optimizing low-variance modes, so that learning plateaus at suboptimal weights. We formalize this effect in analytically tractable settings and propose reversible, label-conditional \emph{preconditioning} maps that reshape the geometry of $p_t$ by improving the conditioning of $\Sigma_t$ without altering the underlying generative model. Rather than accelerating early convergence, preconditioning primarily mitigates optimization stagnation by enabling continued progress along previously suppressed directions. Across MNIST latent flow matching and additional high-resolution datasets, we empirically track conditioning diagnostics and distributional metrics and show that preconditioning consistently yields better-trained models by avoiding ...
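The abstract does not reproduce the paper's construction of the preconditioning maps. As a rough illustration of the general idea only, the sketch below uses per-class whitening as one possible reversible, label-conditional preconditioner and trains a linear-path flow-matching objective in the whitened coordinates. The class `ClassConditionalWhitening`, the `model(zt, t, y)` signature, and the choice of whitening itself are assumptions for illustration, not the paper's method.

```python
# Illustrative sketch only: per-class whitening as one example of a
# reversible, label-conditional preconditioner. The paper's actual
# construction is not given on this page; names, shapes, and the use of
# whitening here are assumptions.
import torch


class ClassConditionalWhitening:
    """Reversible map z = Sigma_y^{-1/2} (x - mu_y), estimated per label y."""

    def __init__(self, eps: float = 1e-5):
        self.eps = eps
        self.mu = {}      # label -> mean vector mu_y
        self.w = {}       # label -> whitening matrix Sigma_y^{-1/2}
        self.w_inv = {}   # label -> inverse map Sigma_y^{1/2}

    def fit(self, x: torch.Tensor, y: torch.Tensor):
        d = x.shape[1]
        for label in y.unique().tolist():
            xl = x[y == label]
            mu = xl.mean(dim=0)
            cov = torch.cov(xl.T) + self.eps * torch.eye(d)
            evals, evecs = torch.linalg.eigh(cov)  # cov = V diag(evals) V^T
            self.mu[label] = mu
            self.w[label] = evecs @ torch.diag(evals.rsqrt()) @ evecs.T
            self.w_inv[label] = evecs @ torch.diag(evals.sqrt()) @ evecs.T
        return self

    def forward(self, x, y):
        return torch.stack(
            [self.w[int(l)] @ (xi - self.mu[int(l)]) for xi, l in zip(x, y)]
        )

    def inverse(self, z, y):
        return torch.stack(
            [self.w_inv[int(l)] @ zi + self.mu[int(l)] for zi, l in zip(z, y)]
        )


def flow_matching_loss(model, precond, x1, y):
    # Conditional-flow-matching loss computed in the preconditioned
    # coordinates: the intermediate marginals of z_t have better-conditioned
    # covariance than those of x_t, so gradient descent is less biased toward
    # the high-variance directions.
    z1 = precond.forward(x1, y)        # whitened data endpoint
    z0 = torch.randn_like(z1)          # standard-normal source sample
    t = torch.rand(z1.shape[0], 1)
    zt = (1 - t) * z0 + t * z1         # linear interpolation path
    target = z1 - z0                   # constant target velocity along the path
    return ((model(zt, t, y) - target) ** 2).mean()
```

Because the map is invertible given the label, samples drawn by integrating the learned flow in the preconditioned coordinates can be mapped back with `precond.inverse`, leaving the generative model over the original variables unchanged, consistent with the abstract's requirement that the preconditioner be reversible.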