[2508.06066] Effective Sample Size and Generalization Bounds for Temporal Networks
Computer Science > Machine Learning
arXiv:2508.06066 (cs)
[Submitted on 8 Aug 2025 (v1), last revised 3 Mar 2026 (this version, v3)]

Title: Effective Sample Size and Generalization Bounds for Temporal Networks
Authors: Barak Gahtan, Alex M. Bronstein

Abstract: Learning from time series is fundamentally different from learning from i.i.d. data: temporal dependence can make long sequences effectively information-poor, yet standard evaluation protocols conflate sequence length with statistical information. We propose a dependence-aware evaluation methodology that controls for effective sample size $N_{\text{eff}}$ rather than raw length $N$, and provide end-to-end generalization guarantees for Temporal Convolutional Networks (TCNs) on $\beta$-mixing sequences. Our analysis combines a blocking/coupling reduction that extracts $B = \Theta(N/\log N)$ approximately independent anchors with an architecture-aware Rademacher bound for $\ell_{2,1}$-norm-controlled convolutional networks, yielding $O(\sqrt{D\log p / B})$ complexity scaling in depth $D$ and kernel size $p$. Empirically, we find that stronger temporal dependence can \emph{reduce} generalization gaps when comparisons control for $N_{\text{eff}}$, a conclusion that reverses under standard fixed-$N$ evaluation, with observed rates of $N_{\text{eff}}^{-0.9}$ to...
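The scaling behavior stated in the abstract can be illustrated numerically. The sketch below is a minimal, hypothetical illustration (not the paper's code): it drops all constants, taking $B = N/\log N$ for the number of approximately independent anchors and $\sqrt{D\log p / B}$ for the complexity term, with the function names and the choice of constant equal to one being assumptions for illustration only.

```python
import math


def effective_blocks(N: int) -> int:
    """Anchors extracted by the blocking/coupling reduction,
    B = Theta(N / log N); the constant is taken as 1 here."""
    return max(1, int(N / math.log(N)))


def complexity_scaling(D: int, p: int, B: int) -> float:
    """Rademacher complexity scaling O(sqrt(D * log p / B))
    for depth D, kernel size p, and B effective samples
    (constants again dropped)."""
    return math.sqrt(D * math.log(p) / B)


# A raw sequence of length N = 100_000 yields far fewer
# effective samples than N, so the bound is governed by B, not N.
B = effective_blocks(100_000)
bound = complexity_scaling(D=8, p=3, B=B)
```

Note that doubling $N$ less than doubles $B$ (the $\log N$ denominator grows too), which is the sense in which long dependent sequences are "information-poor" relative to their raw length.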