[2603.24254] Embracing Heteroscedasticity for Probabilistic Time Series Forecasting
Computer Science > Machine Learning

arXiv:2603.24254 (cs) [Submitted on 25 Mar 2026]

Title: Embracing Heteroscedasticity for Probabilistic Time Series Forecasting
Authors: Yijun Wang, Qiyuan Zhuang, Xiu-Shen Wei

Abstract: Probabilistic time series forecasting (PTSF) aims to model the full predictive distribution of future observations, enabling both accurate forecasting and principled uncertainty quantification. A central requirement of PTSF is to embrace heteroscedasticity, as real-world time series exhibit time-varying conditional variances induced by nonstationary dynamics, regime changes, and evolving external conditions. However, most existing non-autoregressive generative approaches to PTSF, such as TimeVAE and $K^2$VAE, rely on MSE-based training objectives that implicitly impose a homoscedastic assumption, thereby fundamentally limiting their ability to model temporal heteroscedasticity. To address this limitation, we propose the Location-Scale Gaussian VAE (LSG-VAE), a simple but effective framework that explicitly parameterizes both the predictive mean and time-dependent variance through a location-scale likelihood formulation. This design enables LSG-VAE to faithfully capture heteroscedastic aleatoric uncertainty and introduces an adaptive attenuation mechanism that automatically down-weights highly vo...
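The location-scale likelihood described in the abstract corresponds to training with a heteroscedastic Gaussian negative log-likelihood, in which the model predicts a per-timestep variance alongside the mean. Below is a minimal NumPy sketch of that objective (the function name `gaussian_nll` and the toy values are illustrative, not taken from the paper); it also shows the attenuation effect the abstract mentions: a larger predicted variance down-weights the squared-error term for that timestep.

```python
import numpy as np

def gaussian_nll(y, mu, log_var):
    """Heteroscedastic Gaussian NLL per timestep (constants dropped):
    0.5 * (log sigma^2 + (y - mu)^2 / sigma^2).
    Predicting log-variance keeps sigma^2 positive by construction."""
    var = np.exp(log_var)
    return 0.5 * (log_var + (y - mu) ** 2 / var)

# Toy example: identical residual (y - mu = 1) at two timesteps,
# but the model predicts different variances for each.
y = np.array([1.0, 1.0])
mu = np.array([0.0, 0.0])
log_var = np.array([0.0, 2.0])  # sigma^2 = 1 vs. sigma^2 = e^2

loss = gaussian_nll(y, mu, log_var)
# The squared-error contribution (y - mu)^2 / sigma^2 is
# 1.0 at the first timestep but only e^{-2} ~ 0.135 at the second:
# high predicted variance attenuates the error term, while the
# log-variance penalty stops the model from inflating variance everywhere.
err_term = (y - mu) ** 2 / np.exp(log_var)
```

Under an MSE objective, by contrast, both timesteps would contribute equally, which is exactly the implicit homoscedastic assumption the abstract criticizes.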