[2508.02411] HGTS-Former: Hierarchical HyperGraph Transformer for Multivariate Time Series Analysis

arXiv - Machine Learning

About this article

Computer Science > Computer Vision and Pattern Recognition

arXiv:2508.02411 (cs)

[Submitted on 4 Aug 2025 (v1), last revised 1 Mar 2026 (this version, v2)]

Title: HGTS-Former: Hierarchical HyperGraph Transformer for Multivariate Time Series Analysis

Authors: Hao Si, Xiao Wang, Fan Zhang, Xiaoya Zhou, Dengdi Sun, Wanli Lyu, Qingquan Yang, Jin Tang

Abstract: Multivariate time series analysis has long been one of the key research topics in artificial intelligence. However, analyzing complex time series data remains a challenging and unresolved problem due to its high dimensionality, dynamic nature, and complex interactions among variables. Inspired by the strong structural modeling capability of hypergraphs, this paper proposes a novel hypergraph-based time series Transformer backbone network, termed HGTS-Former, to address multivariate coupling in time series data. Specifically, given a multivariate time series signal, we first normalize and embed each patch into tokens. Then, we adopt multi-head self-attention to enhance the temporal representation of each patch. Hierarchical hypergraphs are constructed to aggregate the temporal patterns within each channel and the fine-grained relations between different variables. After that, we convert the hyperedges into node features through ...
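The abstract outlines a three-stage flow: normalize and embed patches into tokens, run multi-head self-attention over each channel's patch tokens, then aggregate tokens through hypergraphs before converting hyperedges back into node features. The sketch below is a minimal illustration of that flow in PyTorch, not the authors' implementation: the learned soft incidence matrix, the single (non-hierarchical) aggregation level, and all names and hyperparameters (PatchEmbed, patch_len, n_edges, ...) are assumptions made for the example.

```python
# Minimal sketch (NOT the paper's code) of the flow the abstract describes:
# patch embedding -> per-channel temporal self-attention -> hypergraph
# aggregation (tokens -> hyperedges -> tokens).
import torch
import torch.nn as nn


class PatchEmbed(nn.Module):
    """Split each channel into non-overlapping patches and project to tokens."""
    def __init__(self, patch_len: int, d_model: int):
        super().__init__()
        self.patch_len = patch_len
        self.proj = nn.Linear(patch_len, d_model)
        self.norm = nn.LayerNorm(d_model)

    def forward(self, x):                                       # x: (B, C, T)
        patches = x.unfold(-1, self.patch_len, self.patch_len)  # (B, C, P, L)
        return self.norm(self.proj(patches))                    # (B, C, P, D)


class HyperGraphAggregate(nn.Module):
    """Two-step message passing: tokens -> hyperedges -> tokens.
    A learned soft incidence matrix stands in for the paper's hierarchical
    hypergraph construction, which the truncated abstract does not detail."""
    def __init__(self, d_model: int, n_edges: int):
        super().__init__()
        self.incidence = nn.Linear(d_model, n_edges)
        self.edge_proj = nn.Linear(d_model, d_model)

    def forward(self, tokens):                              # tokens: (B, N, D)
        H = self.incidence(tokens).softmax(dim=1)           # (B, N, E)
        edges = torch.einsum("bne,bnd->bed", H, tokens)     # hyperedge features
        edges = self.edge_proj(edges)
        # Scatter hyperedge features back to the nodes they cover.
        return tokens + torch.einsum("bne,bed->bnd", H, edges)


class HGTSFlowSketch(nn.Module):
    def __init__(self, patch_len=16, d_model=64, n_heads=4, n_edges=8):
        super().__init__()
        self.embed = PatchEmbed(patch_len, d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.hyper = HyperGraphAggregate(d_model, n_edges)

    def forward(self, x):                      # x: (B, C, T)
        tok = self.embed(x)                    # (B, C, P, D)
        B, C, P, D = tok.shape
        t = tok.reshape(B * C, P, D)           # temporal attention per channel
        t, _ = self.attn(t, t, t)
        t = t.reshape(B, C * P, D)             # pool tokens of all channels
        return self.hyper(t)                   # (B, C*P, D)


if __name__ == "__main__":
    x = torch.randn(2, 7, 96)         # batch of two 7-channel series, length 96
    print(HGTSFlowSketch()(x).shape)  # torch.Size([2, 42, 64])
```

One choice worth noting in this sketch: normalizing the incidence matrix over the node dimension (softmax(dim=1)) makes each hyperedge a convex combination of tokens, so the edge-to-node step cannot inflate token magnitudes; the real model presumably builds its incidence structure hierarchically (within-channel first, then across variables) rather than with a single linear layer.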

Originally published on March 03, 2026. Curated by AI News.

