[2602.19172] Online Realizable Regression and Applications for ReLU Networks
Summary
This paper studies realizable online regression in the adversarial setting, highlighting how it differs from online classification and introducing a potential method that bounds cumulative loss via covering numbers, with applications to ReLU networks.
Why It Matters
Understanding online realizable regression is crucial for advancing machine learning, particularly in adversarial environments. This research provides insights into loss bounds and effective dimensions, which can enhance the design of algorithms for neural networks, especially those using ReLU activation functions.
Key Takeaways
- Realizable online regression can achieve finite cumulative loss without margin assumptions.
- The proposed potential method upper-bounds the scaled Littlestone/online dimension by a Dudley-type entropy integral, which is often easier to analyze.
- Polynomial metric entropy yields finite, horizon-free cumulative-loss bounds.
- A sharp dichotomy is established between regression and classification for bounded-norm ReLU networks.
- The findings have implications for designing more efficient machine learning models.
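To make the first takeaway concrete, here is a minimal toy sketch (not the paper's algorithm, and all names here are illustrative): for the one-parameter ReLU class $h_\theta(x) = \max(0, x - \theta)$ under absolute loss, real-valued feedback lets a midpoint learner shrink its version space geometrically, so cumulative loss stays finite even against an adversary, whereas the thresholded classifier $\mathbb{1}[x \ge \theta]$ admits an infinite mistake bound.

```python
# Toy illustration: realizable online regression on the one-parameter
# ReLU class h_theta(x) = max(0, x - theta), theta in [0, 1], with
# absolute loss. The learner keeps a feasible interval [lo, hi] for
# theta and predicts with the midpoint hypothesis. An adversary that
# probes inside the disputed region forces some loss, but each probe
# shrinks the interval by a factor of 4, so cumulative loss converges
# (here to 1/4 + 1/16 + ... = 1/3) without any margin assumption.

def relu(theta, x):
    return max(0.0, x - theta)

def run(rounds=60):
    lo, hi = 0.0, 1.0          # version space for the unknown theta*
    total_loss = 0.0
    for _ in range(rounds):
        mid = (lo + hi) / 2.0
        x = mid + (hi - lo) / 4.0   # adversarial query in the disputed region
        y_hat = relu(mid, x)        # learner's midpoint prediction
        y = 0.0                     # consistent answer: some theta* >= x exists
        total_loss += abs(y_hat - y)
        lo = max(lo, x)             # y = 0 reveals theta* >= x
    return total_loss
```

Running `run()` returns a cumulative loss just under 1/3, no matter how many rounds the adversary plays; the analogous classification game on thresholds would accumulate one mistake per round forever.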
Computer Science > Machine Learning
arXiv:2602.19172 (cs) [Submitted on 22 Feb 2026]
Title: Online Realizable Regression and Applications for ReLU Networks
Authors: Ilan Doron-Arad, Idan Mehalel, Elchanan Mossel

Abstract: Realizable online regression can behave very differently from online classification. Even without any margin or stochastic assumptions, realizability may enforce horizon-free (finite) cumulative loss under metric-like losses, even when the analogous classification problem has an infinite mistake bound. We study realizable online regression in the adversarial model under losses that satisfy an approximate triangle inequality (approximate pseudo-metrics). Recent work of Attias et al. shows that the minimax realizable cumulative loss is characterized by the scaled Littlestone/online dimension $\mathbb{D}_{\mathrm{onl}}$, but this quantity can be difficult to analyze. Our main contribution is a generic potential method that upper bounds $\mathbb{D}_{\mathrm{onl}}$ by a concrete Dudley-type entropy integral that depends only on covering numbers of the hypothesis class under the induced sup pseudo-metric. We define an \emph{entropy potential} $\Phi(\mathcal{H})=\int_{0}^{\mathrm{diam}(\mathcal{H})} \log N(\mathcal{H},\varepsilon)\,d\varepsilon$, where $N(\mathcal{H},\varepsilon)$ is the $\varepsilon$-covering number of $\mathcal{H}$.
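As a numerical sketch of the entropy potential defined above (illustrative only; the paper applies the integral to ReLU classes, and the class chosen here is an assumption for demonstration), consider the constant-function class $\{h_\theta \equiv \theta : \theta \in [0,1]\}$ under the sup metric, whose $\varepsilon$-covering number is $\lceil 1/(2\varepsilon) \rceil$:

```python
import math

# Numerically evaluate Phi(H) = integral_0^diam log N(H, eps) d eps for the
# toy class of constant functions {h_theta == theta : theta in [0, 1]} under
# the sup metric, where N(eps) = ceil(1/(2*eps)) balls of radius eps cover
# [0, 1]. The integrand blows up as eps -> 0 but remains integrable, so a
# midpoint Riemann sum converges; the exact value here is
# sum_{n>=2} log(n) / (2 n (n-1)) ~= 0.63.

def covering_number(eps):
    return max(1, math.ceil(1.0 / (2.0 * eps)))

def entropy_potential(diam=1.0, grid=200_000):
    h = diam / grid
    return sum(math.log(covering_number((i + 0.5) * h)) * h
               for i in range(grid))
```

Since $\log N(\varepsilon)$ grows only logarithmically here, the integral is finite, which is the mechanism by which polynomial metric entropy translates into horizon-free cumulative-loss bounds.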