Automatic selection of the best neural architecture for time series forecasting
arXiv:2501.12215 [cs.LG] (Computer Science > Machine Learning)
Submitted on 21 Jan 2025 (v1); last revised 2 Apr 2026 (this version, v2)

Authors: Qianying Cao, Shanqing Liu, Alan John Varghese, Jerome Darbon, Michael Triantafyllou, George Em Karniadakis

Abstract: Time series forecasting plays a pivotal role in a wide range of applications, including weather prediction, healthcare, structural health monitoring, predictive maintenance, energy systems, and financial markets. While models such as LSTMs, GRUs, Transformers, and state-space models (SSMs) have become standard tools in this domain, selecting the optimal architecture remains a challenge. Performance comparisons often depend on the evaluation metrics and the datasets under analysis, making the choice of a universally optimal model controversial. In this work, we introduce a flexible automated framework for time series forecasting that systematically designs and evaluates diverse network architectures by integrating LSTM, GRU, multi-head attention, and SSM blocks. Using a multi-objective optimization approach, our framework determines the number, sequence, and combination of blocks to align with specific requirements and evaluation objectives. From the resulting Pareto-optimal architecture...
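The abstract describes two mechanisms: assembling candidate forecasters from a sequence of LSTM, GRU, multi-head attention, and SSM blocks, and keeping only the Pareto-optimal candidates under multiple objectives. The sketch below is a minimal illustration of those two ideas, not the authors' implementation: the names (`make_block`, `ForecastNet`, `pareto_front`), the simplified diagonal SSM recurrence, and the example (error, cost) values are all illustrative assumptions.

```python
# Minimal sketch: build a forecaster from a block sequence such as
# ["lstm", "attention", "ssm"], then filter candidates to a Pareto front.
# All names and the toy SSM are assumptions, not the paper's code.
import torch
import torch.nn as nn


class AttentionBlock(nn.Module):
    """Self-attention over the time axis; tensors are (batch, time, dim)."""
    def __init__(self, dim, heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, x):
        out, _ = self.attn(x, x, x)
        return out


class SSMBlock(nn.Module):
    """Toy diagonal state-space model: h_t = a*h_{t-1} + b*x_t, y_t = c*h_t."""
    def __init__(self, dim):
        super().__init__()
        self.a = nn.Parameter(torch.full((dim,), 0.9))
        self.b = nn.Parameter(torch.ones(dim))
        self.c = nn.Parameter(torch.ones(dim))

    def forward(self, x):
        h = torch.zeros(x.size(0), x.size(2), device=x.device)
        ys = []
        for t in range(x.size(1)):
            h = self.a * h + self.b * x[:, t]
            ys.append(self.c * h)
        return torch.stack(ys, dim=1)


class RNNBlock(nn.Module):
    """Wraps nn.LSTM / nn.GRU so the block returns only the sequence output."""
    def __init__(self, rnn):
        super().__init__()
        self.rnn = rnn

    def forward(self, x):
        out, _ = self.rnn(x)
        return out


def make_block(name, dim):
    if name == "lstm":
        return RNNBlock(nn.LSTM(dim, dim, batch_first=True))
    if name == "gru":
        return RNNBlock(nn.GRU(dim, dim, batch_first=True))
    if name == "attention":
        return AttentionBlock(dim)
    if name == "ssm":
        return SSMBlock(dim)
    raise ValueError(f"unknown block: {name}")


class ForecastNet(nn.Module):
    """Stacks an arbitrary block sequence, then projects the last time step."""
    def __init__(self, blocks, in_dim, dim, horizon):
        super().__init__()
        self.embed = nn.Linear(in_dim, dim)
        self.blocks = nn.ModuleList(make_block(b, dim) for b in blocks)
        self.head = nn.Linear(dim, horizon)

    def forward(self, x):                       # x: (batch, time, in_dim)
        x = self.embed(x)
        for block in self.blocks:
            x = block(x)
        return self.head(x[:, -1])              # forecast the next `horizon` steps


def pareto_front(candidates):
    """Keep candidates not dominated in (error, cost); lower is better in both."""
    return [(name, err, cost) for name, err, cost in candidates
            if not any(e <= err and c <= cost and (e < err or c < cost)
                       for _, e, c in candidates)]


if __name__ == "__main__":
    model = ForecastNet(["lstm", "attention", "ssm"], in_dim=1, dim=32, horizon=12)
    y = model(torch.randn(8, 48, 1))            # 8 series, 48 past steps each
    print(y.shape)                              # torch.Size([8, 12])
    print(pareto_front([("gru-gru", 0.31, 1.0),     # hypothetical candidates
                        ("lstm-attn", 0.25, 2.4),
                        ("ssm", 0.33, 0.6)]))
```

Running the sketch prints the forecast shape and, for three hypothetical (architecture, error, cost) candidates, the non-dominated set a multi-objective search of this kind would retain.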