[2603.04625] K-Means as a Radial Basis Function Network: a Variational and Gradient-based Equivalence
Computer Science > Machine Learning
arXiv:2603.04625 (cs)
[Submitted on 4 Mar 2026]

Title: K-Means as a Radial Basis Function Network: a Variational and Gradient-based Equivalence
Authors: Felipe de Jesus Felix Arredondo, Alejandro Ucan-Puc, Carlos Astengo Noguez

Abstract: This work establishes a rigorous variational and gradient-based equivalence between the classical K-Means algorithm and differentiable Radial Basis Function (RBF) neural networks with smooth responsibilities. By reparameterizing the K-Means objective and embedding its distortion functional into a smooth weighted loss, we prove that the RBF objective $\Gamma$-converges to the K-Means solution as the temperature parameter $\sigma$ vanishes. We further demonstrate that the gradient-based updates of the RBF centers recover the exact K-Means centroid update rule and induce identical training trajectories in the limit. To address the numerical instability of the Softmax transformation in the low-temperature regime, we propose the integration of Entmax-1.5, which ensures stable polynomial convergence while preserving the underlying Voronoi partition structure. These results bridge the conceptual gap between discrete partitioning and continuous optimization, enabling K-Means to be embe...
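The soft-responsibility limit described in the abstract can be illustrated with a minimal NumPy sketch (an illustration of the general idea, not the paper's actual construction; the function names and the toy data are ours). Responsibilities are a softmax over negative squared distances at temperature $\sigma$; setting the gradient of the frozen-responsibility distortion $\sum_{i,k} r_{ik}\lVert x_i - c_k\rVert^2$ to zero gives the responsibility-weighted mean, which collapses to Lloyd's hard centroid update as $\sigma \to 0$:

```python
import numpy as np

def soft_responsibilities(X, C, sigma):
    """Softmax responsibilities r_ik ∝ exp(-||x_i - c_k||^2 / (2 sigma^2))."""
    d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1)  # (n, k) squared distances
    logits = -d2 / (2.0 * sigma ** 2)
    logits -= logits.max(axis=1, keepdims=True)          # numerical stability
    r = np.exp(logits)
    return r / r.sum(axis=1, keepdims=True)

def soft_update(X, C, sigma):
    """Fixed point of the frozen-responsibility distortion:
    grad_{c_k} sum_i r_ik ||x_i - c_k||^2 = -2 sum_i r_ik (x_i - c_k) = 0
    gives the responsibility-weighted mean of the data."""
    R = soft_responsibilities(X, C, sigma)
    return (R.T @ X) / R.sum(axis=0)[:, None]

def hard_update(X, C):
    """Classical Lloyd/K-Means step: assign to nearest center, then average."""
    k = np.argmin(((X[:, None, :] - C[None, :, :]) ** 2).sum(-1), axis=1)
    return np.vstack([X[k == j].mean(axis=0) for j in range(len(C))])

# Toy data: two well-separated blobs; at small sigma the responsibilities
# are numerically one-hot and the soft update equals the hard update.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.1, (20, 2)), rng.normal(5, 0.1, (20, 2))])
C = np.array([[0.5, 0.5], [4.5, 4.5]])
```

At `sigma=0.01` the logit gap between the two centers is so large that each row of the responsibility matrix is exactly one-hot in floating point, so `soft_update` and `hard_update` coincide, matching the vanishing-temperature equivalence the abstract claims.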
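For the proposed Entmax-1.5 replacement, a minimal sketch follows. The 1.5-entmax mapping (Peters et al., 2019) has the closed form $p_i = [\,z_i/2 - \tau\,]_+^2$ with the threshold $\tau$ chosen so the outputs sum to one; the original work gives an exact sorting-based algorithm, but here we find $\tau$ by simple bisection (our choice, for brevity). Unlike softmax, it produces exact zeros for low-scoring entries while keeping the argmax, consistent with the abstract's point about preserving the Voronoi partition:

```python
import numpy as np

def entmax15(z, n_iter=60):
    """1.5-entmax: p_i = max(0, z_i/2 - tau)^2 with sum(p) = 1.
    tau is located by bisection; the sum is monotone decreasing in tau."""
    s = z / 2.0
    s = s - s.max()                   # shift-invariant; improves conditioning
    lo, hi = s.max() - 1.0, s.max()   # sum(lo) >= 1, sum(hi) = 0 bracket the root
    for _ in range(n_iter):
        tau = (lo + hi) / 2.0
        if np.maximum(0.0, s - tau).sum() ** 0 and (np.maximum(0.0, s - tau) ** 2).sum() < 1.0:
            hi = tau                  # tau too large -> mass below 1
        else:
            lo = tau                  # tau too small -> mass above 1
    p = np.maximum(0.0, s - (lo + hi) / 2.0) ** 2
    return p / p.sum()                # absorb the tiny bisection residual
```

Entries whose (shifted) score falls below the threshold are clipped to exactly zero before squaring, so they stay exactly zero after normalization; the highest score always receives the largest probability, so hard assignments are unchanged.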