[2603.20696] High-dimensional online learning via asynchronous decomposition: Non-divergent results, dynamic regularization, and beyond
Statistics > Machine Learning

arXiv:2603.20696 (stat) [Submitted on 21 Mar 2026]

Title: High-dimensional online learning via asynchronous decomposition: Non-divergent results, dynamic regularization, and beyond

Authors: Shixiang Liu, Zhifan Li, Hanming Yang, Jianxin Yin

Abstract: Existing high-dimensional online learning methods often face the challenge that their error bounds, or required per-batch sample sizes, diverge as the number of data batches increases. To address this issue, we propose an asynchronous decomposition framework that leverages summary statistics to construct a surrogate score function for current-batch learning. This framework is implemented via a dynamic-regularized iterative hard thresholding algorithm, providing a computationally and memory-efficient solution for sparse online optimization. We provide a unified theoretical analysis that accounts for both the streaming computational error and the statistical accuracy, establishing that our estimator maintains non-divergent error bounds and $\ell_0$ sparsity across all batches. Furthermore, the proposed estimator adaptively achieves additional gains as batches accumulate, attaining the oracle accuracy as if the entire historical dataset were accessible and the true support were known. These theoretic...
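The abstract describes the method only at a high level. Below is a minimal sketch of the general recipe it outlines: maintain summary statistics across batches, form a surrogate score from them, and run iterative hard thresholding with a regularization weight that shrinks as data accumulate. Everything concrete here is an assumption, not the paper's algorithm: a linear model with squared loss, summary statistics taken to be the accumulated Gram matrix and cross-moment (a full p-by-p Gram matrix is itself not memory-efficient in high dimensions, so the paper's actual summaries presumably differ), and a hypothetical schedule lam = sqrt(log p / n) for the dynamic regularizer.

```python
import numpy as np

def hard_threshold(v, s):
    """Project v onto s-sparse vectors: keep the s largest-magnitude entries."""
    out = np.zeros_like(v)
    keep = np.argsort(np.abs(v))[-s:]
    out[keep] = v[keep]
    return out

class OnlineSparseIHT:
    """Sketch of streaming sparse estimation via iterative hard thresholding.

    Summary statistics (an illustrative choice, not the paper's): accumulated
    Gram matrix S_xx = sum_b X_b^T X_b and cross-moment S_xy = sum_b X_b^T y_b.
    The surrogate score at beta is the gradient of the averaged squared loss,
    (S_xx @ beta - S_xy) / n, computable from the summaries alone, so raw
    historical batches never need to be stored.
    """

    def __init__(self, p, s, n_iters=100):
        self.S_xx = np.zeros((p, p))   # accumulated Gram matrix
        self.S_xy = np.zeros(p)        # accumulated cross-moment
        self.n = 0                     # total samples seen so far
        self.s = s                     # target l0 sparsity level
        self.n_iters = n_iters
        self.beta = np.zeros(p)        # current s-sparse estimate

    def update(self, X, y):
        """Fold a new batch into the summaries, then re-solve by IHT."""
        self.S_xx += X.T @ X
        self.S_xy += X.T @ y
        self.n += X.shape[0]
        p = self.beta.size
        # Hypothetical dynamic regularization weight, shrinking as data
        # accumulate (the paper's schedule is not given in the abstract).
        lam = np.sqrt(np.log(p) / self.n)
        # Conservative step size: inverse of the largest eigenvalue of the
        # averaged Gram matrix, so gradient steps are non-expansive.
        step = 1.0 / np.linalg.norm(self.S_xx / self.n, 2)
        beta = self.beta               # warm-start from the previous estimate
        for _ in range(self.n_iters):
            grad = (self.S_xx @ beta - self.S_xy) / self.n  # surrogate score
            beta = hard_threshold(beta - step * (grad + lam * beta), self.s)
        self.beta = beta
        return self.beta
```

A toy run on synthetic data, with 20 batches of 50 samples each, illustrates the behavior the abstract claims: the iterate stays exactly s-sparse across batches while its error keeps shrinking as batches accumulate.

```python
rng = np.random.default_rng(0)
p, s = 200, 5
beta_star = np.zeros(p)
beta_star[:s] = 1.0                    # true 5-sparse coefficient vector
model = OnlineSparseIHT(p=p, s=s)
for _ in range(20):
    X = rng.standard_normal((50, p))
    y = X @ beta_star + 0.1 * rng.standard_normal(50)
    model.update(X, y)
    print(np.linalg.norm(model.beta - beta_star))  # error shrinks per batch
```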