[2603.24207] IPatch: A Multi-Resolution Transformer Architecture for Robust Time-Series Forecasting


Computer Science > Machine Learning
arXiv:2603.24207 (cs) [Submitted on 25 Mar 2026]

Title: IPatch: A Multi-Resolution Transformer Architecture for Robust Time-Series Forecasting

Authors: Aymane Harkati, Moncef Garouani, Olivier Teste, Julien Aligon, Mohamed Hamlich

Abstract: Accurate forecasting of multivariate time series remains challenging due to the need to capture both short-term fluctuations and long-range temporal dependencies. Transformer-based models have emerged as a powerful approach, but their performance depends critically on the representation of temporal data. Traditional point-wise representations preserve individual time-step information, enabling fine-grained modeling, yet they tend to be computationally expensive and less effective at modeling broader contextual dependencies, limiting their scalability to long sequences. Patch-wise representations aggregate consecutive steps into compact tokens to improve efficiency and model local temporal dynamics, but they often discard fine-grained temporal details that are critical for accurate predictions in volatile or complex time series. We propose IPatch, a multi-resolution Transformer architecture that integrates both point-wise and patch-wise tokens, modeling temporal information at multiple resolutions. Experiments on 7 benchma...
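The abstract contrasts point-wise tokens (one token per time step) with patch-wise tokens (one token per block of consecutive steps) and says IPatch feeds both to the Transformer. The paper's actual implementation is not shown here; as a minimal sketch of the general idea, the snippet below builds both token types for a univariate series and concatenates them along the token axis. The function name `multi_resolution_tokens` and the fixed random projections are illustrative assumptions, not the authors' code; a real model would learn the projection weights.

```python
import numpy as np

rng = np.random.default_rng(0)

def multi_resolution_tokens(x, patch_len, d_model):
    """Hypothetical mixed token sequence for a univariate series.

    x         : (L,) time series
    patch_len : number of consecutive steps aggregated into one patch token
    d_model   : embedding width shared by both token types

    Point-wise tokens keep one token per time step (fine resolution);
    patch-wise tokens aggregate `patch_len` steps (coarse resolution).
    Both are linearly projected to d_model and concatenated, so the
    Transformer attends over both resolutions at once.
    """
    L = x.shape[0]
    assert L % patch_len == 0, "pad the series so patches tile it evenly"

    # Illustrative random projections; in a trained model these are learned.
    W_point = rng.standard_normal((1, d_model)) / np.sqrt(d_model)
    W_patch = rng.standard_normal((patch_len, d_model)) / np.sqrt(d_model)

    point_tokens = x[:, None] @ W_point               # (L, d_model)
    patches = x.reshape(L // patch_len, patch_len)    # (L / patch_len, patch_len)
    patch_tokens = patches @ W_patch                  # (L / patch_len, d_model)

    # Token sequence length = L point tokens + L/patch_len patch tokens.
    return np.concatenate([point_tokens, patch_tokens], axis=0)

tokens = multi_resolution_tokens(rng.standard_normal(96), patch_len=16, d_model=8)
print(tokens.shape)  # (102, 8): 96 point tokens + 6 patch tokens
```

This makes the trade-off in the abstract concrete: the point tokens scale linearly with sequence length (the quadratic-attention cost the abstract flags), while the patch tokens compress the same span by a factor of `patch_len` but lose per-step detail.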

Originally published on March 26, 2026. Curated by AI News.
