[2301.01741] Graph State-Space Models and Latent Relational Inference
Computer Science > Machine Learning
arXiv:2301.01741 (cs)
[Submitted on 4 Jan 2023 (v1), last revised 5 Apr 2026 (this version, v2)]

Title: Graph State-Space Models and Latent Relational Inference
Authors: Daniele Zambon, Andrea Cini, Cesare Alippi

Abstract: State-space models effectively model multivariate time series by updating, over time, a representation of the system state from which predictions are made. The state representation is usually a vector without any explicit structure, so relational inductive biases, e.g., those associated with dependencies among input signals and state representations, are not explicitly exploited during processing, leaving opportunities for more effective modeling unexplored. This manuscript aims to fill this gap by bringing state-space modeling to spatio-temporal data whose relational information, namely the functional graph capturing latent dependencies, is learned directly from the time series. In particular, we propose Graph State-Space Models, a novel probabilistic framework that jointly learns the state-space dynamics and the latent relational structure end-to-end on downstream tasks. The proposed framework generalizes several state-of-the-art methods and, as we show, is effective both in extracting meaningful latent relational structures and in producing accurate forecasts.

Subjects: Machine Learning (cs.LG)
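To make the idea concrete, the sketch below illustrates the kind of recurrence the abstract describes: a graph-structured state, one vector per node, updated over time by aggregating neighbor states through an adjacency matrix, with per-node predictions read out from the state. This is a minimal, hypothetical illustration only: the adjacency here is fixed at random, whereas in the paper the latent relational structure is probabilistic and learned jointly with the dynamics, and all parameter names and the specific parametrization are assumptions, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(0)

n_nodes, d_in, d_state = 4, 2, 3  # illustrative sizes

# Latent adjacency: in the paper this relational structure is
# inferred end-to-end; here it is a fixed random row-stochastic matrix.
A = rng.random((n_nodes, n_nodes))
A = A / A.sum(axis=1, keepdims=True)

# Illustrative parameters of the dynamics and readout (assumed names).
W_x = 0.1 * rng.standard_normal((d_in, d_state))
W_z = 0.1 * rng.standard_normal((d_state, d_state))
W_o = 0.1 * rng.standard_normal((d_state, 1))


def step(Z, X):
    """One state update: neighbor states are aggregated via A,
    then combined with the current input (a graph recurrence)."""
    return np.tanh(A @ Z @ W_z + X @ W_x)


def readout(Z):
    """Per-node prediction from the graph-structured state."""
    return Z @ W_o


# Roll the recurrence over a short multivariate series:
# one d_in-dimensional input per node per time step.
T = 5
X_seq = rng.standard_normal((T, n_nodes, d_in))
Z = np.zeros((n_nodes, d_state))
for X in X_seq:
    Z = step(Z, X)
pred = readout(Z)
print(Z.shape, pred.shape)
```

The contrast with a plain state-space model is that the state is an `(n_nodes, d_state)` matrix rather than a single flat vector, and the term `A @ Z` is where the (here fixed, in the paper learned) relational structure enters the dynamics.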