[2502.01383] InfoBridge: Mutual Information estimation via Bridge Matching
Computer Science > Machine Learning
arXiv:2502.01383 (cs)
[Submitted on 3 Feb 2025 (v1), last revised 27 Feb 2026 (this version, v4)]

Title: InfoBridge: Mutual Information estimation via Bridge Matching
Authors: Sergei Kholkin, Ivan Butakov, Evgeny Burnaev, Nikita Gushchin, Alexander Korotin

Abstract: Diffusion bridge models have recently become a powerful tool in generative modeling. In this work, we leverage their power to address another important problem in machine learning and information theory: estimating the mutual information (MI) between two random variables. By framing MI estimation as a domain transfer problem, we construct an unbiased estimator for data that poses difficulties for conventional MI estimators. We showcase the performance of our estimator on three standard MI estimation benchmarks (low-dimensional, image-based, and high-MI) and on real-world data (protein language model embeddings).

Subjects: Machine Learning (cs.LG); Machine Learning (stat.ML)
Cite as: arXiv:2502.01383 [cs.LG] (or arXiv:2502.01383v4 [cs.LG] for this version)
DOI: https://doi.org/10.48550/arXiv.2502.01383

Submission history
From: Sergei Kholkin
[v1] Mon, 3 Feb 2025 14:18:37 UTC (163 KB)
[v2] Mon, 26 May 2025 15:35:24 UTC (1,386 KB)
[v...
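As background for the low-dimensional benchmarks mentioned in the abstract (this is a generic illustration, not the paper's estimator): MI benchmarks typically rely on distributions with a known closed-form MI, such as a bivariate Gaussian with correlation rho, where I(X;Y) = -1/2 * log(1 - rho^2) in nats. A minimal sketch of such a ground-truth setup, with hypothetical names:

```python
import numpy as np

def gaussian_mi(rho: float) -> float:
    """Closed-form MI (in nats) of a bivariate Gaussian with correlation rho."""
    return -0.5 * np.log(1.0 - rho**2)

# Sample correlated Gaussian data, as a low-dimensional benchmark would,
# so an estimator's output can be compared against the known ground truth.
rng = np.random.default_rng(0)
rho = 0.9
cov = np.array([[1.0, rho], [rho, 1.0]])
xy = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=10_000)

print(f"ground-truth MI for rho={rho}: {gaussian_mi(rho):.3f} nats")
```

An estimator evaluated on `xy` should recover a value close to `gaussian_mi(rho)`; image-based and high-MI benchmarks extend the same idea to harder distributions with known MI.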