[2603.03405] Surprisal-Rényi Free Energy
Statistics > Machine Learning
arXiv:2603.03405 (stat)
[Submitted on 3 Mar 2026]

Title: Surprisal-Rényi Free Energy
Authors: Shion Matsumoto, Raul Castillo, Benjamin Prada, Ankur Arjun Mali

Abstract: The forward and reverse Kullback-Leibler (KL) divergences arise as limiting objectives in learning and inference, yet induce markedly different inductive biases that cannot be explained at the level of expectations alone. In this work, we introduce the Surprisal-Rényi Free Energy (SRFE), a log-moment-based functional of the likelihood ratio that lies outside the class of $f$-divergences. We show that SRFE recovers forward and reverse KL divergences as singular endpoint limits and derive local expansions around both limits in which the variance of the log-likelihood ratio appears as a first-order correction. This reveals an explicit mean-variance tradeoff governing departures from KL-dominated regimes. We further establish a Gibbs-type variational characterization of SRFE as the unique minimizer of a weighted sum of KL divergences and prove that SRFE directly controls large deviations of excess code-length via Chernoff-type bounds, yielding a precise Minimum Description Length interpretation. Together, these results identify SRFE as a variance- and tail-sensitive free-energy functional that clarifies the geometric and large-deviation structure underlying ...
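The abstract does not spell out the exact form of SRFE, so the following is only an illustrative sketch: a standard log-moment functional of the likelihood ratio with the stated endpoint behaviour is the normalised cumulant generating function $F_\alpha(p\,\|\,q) = -\tfrac{1}{\alpha(1-\alpha)}\log \mathbb{E}_q[(p/q)^\alpha]$, which tends to the reverse KL $\mathrm{KL}(q\,\|\,p)$ as $\alpha \to 0$ and to the forward KL $\mathrm{KL}(p\,\|\,q)$ as $\alpha \to 1$. The Python snippet below checks this interpolation numerically for two hypothetical univariate Gaussians; it is not claimed to be the paper's SRFE, only a minimal example of a log-moment functional recovering both KL divergences as endpoint limits.

```python
import numpy as np
from scipy import integrate

# Hypothetical example densities: p = N(mu_p, s_p^2), q = N(mu_q, s_q^2).
mu_p, s_p = 0.0, 1.0
mu_q, s_q = 1.0, 1.5

def log_pdf(x, mu, s):
    return -0.5 * np.log(2 * np.pi * s**2) - (x - mu) ** 2 / (2 * s**2)

def kl_gauss(mu1, s1, mu2, s2):
    # Closed-form KL(N(mu1, s1^2) || N(mu2, s2^2)) for univariate Gaussians.
    return np.log(s2 / s1) + (s1**2 + (mu1 - mu2) ** 2) / (2 * s2**2) - 0.5

def log_moment(alpha):
    # K(alpha) = log E_q[(p/q)^alpha]: the cumulant generating function
    # of the log-likelihood ratio log(p/q) under q.
    integrand = lambda x: np.exp(
        alpha * log_pdf(x, mu_p, s_p) + (1 - alpha) * log_pdf(x, mu_q, s_q)
    )
    val, _ = integrate.quad(integrand, -50.0, 50.0)
    return np.log(val)

def free_energy(alpha):
    # Normalised log-moment functional (illustrative; not the paper's exact SRFE).
    # Limits: alpha -> 0 gives KL(q||p), alpha -> 1 gives KL(p||q).
    return -log_moment(alpha) / (alpha * (1 - alpha))

for a in [0.01, 0.1, 0.5, 0.9, 0.99]:
    print(f"alpha={a:4.2f}  F_alpha={free_energy(a):.4f}")

print("KL(q||p) =", kl_gauss(mu_q, s_q, mu_p, s_p))  # alpha -> 0 endpoint
print("KL(p||q) =", kl_gauss(mu_p, s_p, mu_q, s_q))  # alpha -> 1 endpoint
```

Running the sketch shows the values at alpha = 0.01 and alpha = 0.99 approaching the closed-form reverse and forward KL respectively, which mirrors the endpoint-limit behaviour described in the abstract.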