[2604.02920] Efficient Logistic Regression with Mixture of Sigmoids
Computer Science > Machine Learning

arXiv:2604.02920 (cs)

[Submitted on 3 Apr 2026]

Title: Efficient Logistic Regression with Mixture of Sigmoids

Authors: Federico Di Gennaro, Saptarshi Chakraborty, Nikita Zhivotovskiy

Abstract: This paper studies the Exponential Weights (EW) algorithm with an isotropic Gaussian prior for online logistic regression. We show that the near-optimal worst-case regret bound $O(d\log(Bn))$ for EW, established by Kakade and Ng (2005) against the best linear predictor of norm at most $B$, can be achieved with total worst-case computational complexity $O(B^3 n^5)$. This substantially improves on the $O(B^{18} n^{37})$ complexity of prior work achieving the same guarantee (Foster et al., 2018). Beyond efficiency, we analyze the large-$B$ regime under linear separability: after rescaling by $B$, the EW posterior converges as $B\to\infty$ to a standard Gaussian truncated to the version cone. Accordingly, the predictor converges to a solid-angle vote over separating directions and, on every fixed-margin slice of this cone, the mode of the corresponding truncated Gaussian is aligned with the hard-margin SVM direction. Using this geometry, we derive non-asymptotic regret bounds showing that once $B$ exceeds a margin-dependent threshold, the regret becomes independent of $B$ and grows only logarithmically...
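The EW predictor the abstract refers to admits a simple Monte Carlo rendering: draw weight vectors from the isotropic Gaussian prior, reweight each draw by the exponentiated negative cumulative logistic loss on the rounds seen so far (with learning rate one, this is exactly the likelihood of the observed labels), and predict with the weighted average sigmoid. The sketch below is illustrative only: the prior variance, the sample count, and all function names are assumptions of this note, and this naive sampler is not the paper's $O(B^3 n^5)$ procedure.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def ew_predict(X_past, y_past, x_new, sigma2=1.0, n_samples=50_000, rng=None):
    """Monte Carlo sketch of the Exponential Weights predictor for online
    logistic regression with an isotropic Gaussian prior N(0, sigma2 * I).

    Returns E_{w ~ pi_t}[sigmoid(<w, x_new>)], where pi_t reweights the
    prior by exp(-cumulative logistic loss on past rounds), i.e. by the
    likelihood of the observed labels (learning rate eta = 1).
    Labels are assumed to lie in {0, 1}.
    """
    rng = np.random.default_rng() if rng is None else rng
    d = x_new.shape[0]
    W = rng.normal(scale=np.sqrt(sigma2), size=(n_samples, d))  # prior draws

    if len(y_past) > 0:
        P = sigmoid(W @ np.asarray(X_past, dtype=float).T)      # (n_samples, t)
        y = np.asarray(y_past, dtype=float)
        log_w = (y * np.log(P) + (1.0 - y) * np.log(1.0 - P)).sum(axis=1)
        log_w -= log_w.max()                                    # numerical stability
        weights = np.exp(log_w)
    else:
        weights = np.ones(n_samples)                            # round 1: the prior
    weights /= weights.sum()

    return float(weights @ sigmoid(W @ x_new))
```

Calling ew_predict(X[:t], y[:t], X[t]) at each round t produces the sequence of EW probability forecasts whose cumulative log loss the $O(d\log(Bn))$ regret bound controls.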
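The large-$B$ limit also has a direct sampling interpretation: since the rescaled posterior is a standard Gaussian truncated to the version cone, the limiting prediction is the Gaussian measure of the separating directions that classify the query point as positive. A hedged rejection-sampling sketch follows, assuming labels in $\{-1,+1\}$ and a linearly separable sample; the names are hypothetical, and rejection sampling serves only as illustration, since its acceptance rate decays quickly as the number of constraints grows.

```python
import numpy as np

def solid_angle_vote(X, y, x_new, n_samples=200_000, rng=None):
    """Monte Carlo sketch of the limiting (B -> infinity) EW prediction:
    sample w ~ N(0, I), keep the separating directions -- the version cone
    {w : y_i * <x_i, w> > 0 for all i} -- and report the fraction of
    surviving directions with <x_new, w> > 0 (a solid-angle vote).
    Labels are assumed to lie in {-1, +1}."""
    rng = np.random.default_rng() if rng is None else rng
    W = rng.normal(size=(n_samples, X.shape[1]))
    in_cone = ((W @ X.T) * y > 0).all(axis=1)        # rejection step
    W_cone = W[in_cone]
    if W_cone.shape[0] == 0:
        raise RuntimeError("no sampled direction separated the data; "
                           "increase n_samples")
    return float((W_cone @ x_new > 0).mean())
```

Because the standard Gaussian is rotationally invariant, this accept-then-vote estimate measures the normalized solid angle of the sub-cone of separating directions with $\langle x_{\text{new}}, w\rangle > 0$, which is the limiting predictor described in the abstract.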