[2603.24207] IPatch: A Multi-Resolution Transformer Architecture for Robust Time-Series Forecasting
Computer Science > Machine Learning
arXiv:2603.24207 (cs) [Submitted on 25 Mar 2026]

Title: IPatch: A Multi-Resolution Transformer Architecture for Robust Time-Series Forecasting
Authors: Aymane Harkati, Moncef Garouani, Olivier Teste, Julien Aligon, Mohamed Hamlich

Abstract: Accurate forecasting of multivariate time series remains challenging due to the need to capture both short-term fluctuations and long-range temporal dependencies. Transformer-based models have emerged as a powerful approach, but their performance depends critically on the representation of temporal data. Traditional point-wise representations preserve individual time-step information, enabling fine-grained modeling, yet they tend to be computationally expensive and less effective at modeling broader contextual dependencies, limiting their scalability to long sequences. Patch-wise representations aggregate consecutive steps into compact tokens to improve efficiency and model local temporal dynamics, but they often discard fine-grained temporal details that are critical for accurate predictions in volatile or complex time series. We propose IPatch, a multi-resolution Transformer architecture that integrates both point-wise and patch-wise tokens, modeling temporal information at multiple resolutions. Experiments on 7 benchma...
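The abstract contrasts point-wise tokens (one token per time step) with patch-wise tokens (consecutive steps aggregated into one token). The sketch below illustrates how both token sets can be built from a single series; the function names and the patch length/stride parameters are hypothetical illustration choices, not the paper's actual implementation.

```python
import numpy as np

def patchify(series, patch_len, stride):
    """Aggregate consecutive time steps into patch tokens.

    Hypothetical helper: slices the series into windows of length
    `patch_len` taken every `stride` steps.
    """
    series = np.asarray(series, dtype=float)
    starts = range(0, len(series) - patch_len + 1, stride)
    return np.stack([series[s:s + patch_len] for s in starts])

def multi_resolution_tokens(series, patch_len=4, stride=4):
    """Return both token views of the same series.

    Point-wise tokens keep every individual time step (fine-grained);
    patch-wise tokens compress local windows (efficient, contextual).
    """
    series = np.asarray(series, dtype=float)
    points = series.reshape(-1, 1)                 # shape (n, 1)
    patches = patchify(series, patch_len, stride)  # shape (num_patches, patch_len)
    return points, patches

# Example: a 12-step series yields 12 point tokens and 3 non-overlapping patches.
series = np.arange(12.0)
points, patches = multi_resolution_tokens(series)
print(points.shape, patches.shape)  # (12, 1) (3, 4)
```

In a multi-resolution architecture of this kind, both token sequences would be embedded and fed to the Transformer so attention can mix fine-grained and coarse temporal context.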