[2510.12453] Time-Correlated Video Bridge Matching
Computer Science > Machine Learning

arXiv:2510.12453 (cs)

[Submitted on 14 Oct 2025 (v1), last revised 26 Mar 2026 (this version, v2)]

Title: Time-Correlated Video Bridge Matching

Authors: Viacheslav Vasilev, Arseny Ivanov, Nikita Gushchin, Maria Kovaleva, Alexander Korotin

Abstract: Diffusion models excel at noise-to-data generation tasks, providing a mapping from a Gaussian distribution to a more complex data distribution. However, they struggle to model translations between complex distributions, limiting their effectiveness in data-to-data tasks. While Bridge Matching (BM) models address this by finding the translation between data distributions, their application to time-correlated data sequences remains unexplored. This is a critical limitation for video generation and manipulation tasks, where maintaining temporal coherence is particularly important. To address this gap, we propose Time-Correlated Video Bridge Matching (TCVBM), a framework that extends BM to time-correlated data sequences in the video domain. TCVBM explicitly models inter-sequence dependencies within the diffusion bridge, directly incorporating temporal correlations into the sampling process. We compare our approach to classical methods based on bridge matching and diffusion models for three video-related tasks: frame interpolation, image-to-video generation, and video ...
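For context on the bridge construction the abstract refers to: standard Bridge Matching trains against samples from a diffusion bridge pinned at paired endpoints, whose marginal at time t is Gaussian around the linear interpolant. The sketch below illustrates that standard (per-sample independent) Brownian-bridge interpolant only; the function name and toy shapes are illustrative, and TCVBM's actual mechanism for correlating noise across a frame sequence is not reproduced here.

```python
import numpy as np

def brownian_bridge_sample(x0, x1, t, rng):
    """Sample x_t from a Brownian bridge pinned at x0 (t=0) and x1 (t=1).

    Marginal: x_t ~ N((1 - t) * x0 + t * x1, t * (1 - t) * I).
    This is the interpolant classical Bridge Matching regresses against.
    Per the abstract, TCVBM additionally models dependencies across frames
    instead of drawing this noise independently per frame (not shown here).
    """
    mean = (1.0 - t) * x0 + t * x1
    std = np.sqrt(t * (1.0 - t))
    return mean + std * rng.standard_normal(x0.shape)

rng = np.random.default_rng(0)
x0 = np.zeros((4, 8))  # toy "source" clip: 4 frames, 8-dim features
x1 = np.ones((4, 8))   # toy "target" clip
xt = brownian_bridge_sample(x0, x1, 0.5, rng)  # noisy midpoint sample
```

Note that the bridge variance t(1 - t) vanishes at both endpoints, so samples are pinned exactly to the source and target data, which is what makes the construction suitable for data-to-data translation.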