[2511.16145] Labels Matter More Than Models: Rethinking the Unsupervised Paradigm in Time Series Anomaly Detection
Computer Science > Machine Learning
arXiv:2511.16145 (cs)
[Submitted on 20 Nov 2025 (v1), last revised 2 Apr 2026 (this version, v2)]

Title: Labels Matter More Than Models: Rethinking the Unsupervised Paradigm in Time Series Anomaly Detection
Authors: Zhijie Zhong, Zhiwen Yu, Kaixiang Yang, Yongheng Liu, Jun Jiang, C. L. Philip Chen

Abstract: Time series anomaly detection (TSAD) is a critical data mining task often constrained by label scarcity. Consequently, current research predominantly focuses on Unsupervised Time-series Anomaly Detection (UTAD), relying on increasingly complex architectures to model normal data distributions. However, this algorithm-centric trend often overlooks the significant performance gains achievable from the limited anomaly labels available in practical scenarios. This paper challenges the premise that algorithmic complexity is the optimal path for TSAD. Instead of proposing another intricate unsupervised model, we present a comprehensive benchmark and empirical study to rigorously compare supervised and unsupervised paradigms. To isolate the value of labels, we introduce \stand, a deliberately minimalist supervised baseline. Extensive experiments on five public datasets demonstrate that: (1) Labels matter more than models: under a limited labeling budget,...
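To make the abstract's notion of a "deliberately minimalist supervised baseline" concrete, here is a hedged sketch: sliding-window features fed to a plain logistic-regression classifier trained on a small labeled budget. This is not the paper's \stand model (whose design is not given here); the window size, learning rate, and toy data below are illustrative assumptions only.

```python
# Hypothetical minimalist supervised TSAD baseline (NOT the paper's \stand):
# overlapping sliding windows as features, logistic regression as classifier.
import numpy as np

def make_windows(x, labels, w=16):
    # A window is labeled anomalous if any point inside it is anomalous.
    X = np.stack([x[i:i + w] for i in range(len(x) - w + 1)])
    y = np.array([labels[i:i + w].max() for i in range(len(x) - w + 1)])
    return X, y

def train_logreg(X, y, lr=0.1, epochs=500):
    # Plain logistic regression via batch gradient descent (no external deps).
    Xb = np.hstack([X, np.ones((len(X), 1))])  # append bias column
    wgt = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-Xb @ wgt))
        wgt -= lr * Xb.T @ (p - y) / len(y)
    return wgt

def score(X, wgt):
    # Per-window anomaly probability.
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return 1.0 / (1.0 + np.exp(-Xb @ wgt))

# Toy series: a noisy sine wave with one injected spike anomaly.
rng = np.random.default_rng(0)
t = np.arange(400)
x = np.sin(t / 10.0) + 0.05 * rng.standard_normal(400)
labels = np.zeros(400)
x[200:205] += 3.0          # injected anomaly
labels[200:205] = 1.0

X, y = make_windows(x, labels)
wgt = train_logreg(X, y)
s = score(X, wgt)
```

Even this bare-bones classifier illustrates the paper's point: given a handful of anomaly labels, a trivial supervised model has direct signal that an unsupervised density model must infer indirectly.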