[2508.02411] HGTS-Former: Hierarchical HyperGraph Transformer for Multivariate Time Series Analysis
Computer Science > Computer Vision and Pattern Recognition
arXiv:2508.02411 (cs)
[Submitted on 4 Aug 2025 (v1), last revised 1 Mar 2026 (this version, v2)]

Title: HGTS-Former: Hierarchical HyperGraph Transformer for Multivariate Time Series Analysis
Authors: Hao Si, Xiao Wang, Fan Zhang, Xiaoya Zhou, Dengdi Sun, Wanli Lyu, Qingquan Yang, Jin Tang

Abstract: Multivariate time series analysis has long been one of the key research topics in artificial intelligence. However, analyzing complex time series data remains a challenging and unresolved problem due to its high dimensionality, dynamic nature, and complex interactions among variables. Inspired by the strong structural modeling capability of hypergraphs, this paper proposes a novel hypergraph-based time series Transformer backbone network, termed HGTS-Former, to address multivariate coupling in time series data. Specifically, given a multivariate time series signal, we first normalize and embed each patch into tokens. Then, we adopt multi-head self-attention to enhance the temporal representation of each patch. Hierarchical hypergraphs are constructed to aggregate the temporal patterns within each channel and the fine-grained relations between different variables. After that, we convert the hyperedge into node features through ...
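The pipeline the abstract walks through (patch embedding of each channel, self-attention over the patch tokens, and hyperedge-to-node aggregation) can be sketched in NumPy as below. This is a minimal illustrative sketch, not the paper's implementation: the single attention head, the shapes, the mean-based hyperedge aggregation, and the hand-built binary incidence matrix are all assumptions for demonstration.

```python
import numpy as np

def patchify(x, patch_len):
    """Split a multivariate series x of shape (C, T) into
    non-overlapping patches of shape (C, N, patch_len)."""
    C, T = x.shape
    n = T // patch_len
    return x[:, : n * patch_len].reshape(C, n, patch_len)

def embed(patches, W):
    """Project each patch to a d-dim token: (C, N, P) @ (P, d) -> (C, N, d)."""
    return patches @ W

def self_attention(tokens):
    """Per-channel scaled dot-product self-attention over patch tokens
    (single head for brevity; the paper uses multi-head attention)."""
    d = tokens.shape[-1]
    scores = tokens @ tokens.transpose(0, 2, 1) / np.sqrt(d)
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=-1, keepdims=True)
    return attn @ tokens

def hyperedge_aggregate(tokens, incidence):
    """One hypergraph message-passing step: each hyperedge feature is the
    mean of its member tokens, and each node is then updated with the mean
    of the hyperedges it belongs to.
    tokens: (C, N, d); incidence: (N, E) binary node-to-hyperedge matrix."""
    deg_e = incidence.sum(axis=0)  # (E,) hyperedge sizes
    deg_n = incidence.sum(axis=1)  # (N,) node memberships
    edge_feat = np.einsum("cnd,ne->ced", tokens, incidence) / deg_e[None, :, None]
    node_feat = np.einsum("ced,ne->cnd", edge_feat, incidence) / deg_n[None, :, None]
    return edge_feat, node_feat

rng = np.random.default_rng(0)
x = rng.standard_normal((3, 32))        # 3 channels, 32 timesteps
W = rng.standard_normal((8, 16)) * 0.1  # patch length 8 -> token dim 16
tokens = self_attention(embed(patchify(x, 8), W))
# toy grouping: two hyperedges, each covering two of the four patches
incidence = np.array([[1, 0], [1, 0], [0, 1], [0, 1]], dtype=float)
edge_feat, node_feat = hyperedge_aggregate(tokens, incidence)
```

In this sketch the hypergraph has one level; the hierarchical construction in the paper stacks such aggregation steps so that intra-channel temporal patterns and cross-variable relations are captured at different granularities.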