[2603.02840] Adapting Time Series Foundation Models through Data Mixtures
Computer Science > Machine Learning
arXiv:2603.02840 (cs) [Submitted on 3 Mar 2026]

Title: Adapting Time Series Foundation Models through Data Mixtures
Authors: Thomas L. Lee, Edoardo M. Ponti, Amos Storkey

Abstract: Time series foundation models (TSFMs) have become increasingly popular for zero-shot forecasting. However, performance can suffer on a new time series domain not fully covered by the pretraining set. Therefore, when a practitioner cares about a new domain and has access to a set of related datasets, the question arises: how best to fine-tune a TSFM to improve zero-shot forecasting? A typical approach to this type of problem is to fine-tune a LoRA module either on all datasets jointly or separately on each dataset. Tuning a separate module on each dataset allows the TSFM to specialise to different types of data distribution, by selecting different combinations of per-dataset modules for different time series contexts. However, we find that per-dataset modules may not be optimal, since a single time series dataset can contain data from several types of distribution, i.e. sub-domains. This can arise because the distribution shifts over time, or because different dimensions of the time series follow different distributions. Hence, we propose MixFT, which re-divides the data using Bayesian mixtures into sets t...
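The core idea of re-dividing data by mixture components, rather than by dataset boundaries, can be illustrated with a toy sketch. The following is not the paper's implementation: it fits a simple 1-D Gaussian mixture by EM to per-window summary features of a corpus and hard-assigns each window to a component, so that each component's windows could then receive their own fine-tuning module (e.g. a LoRA adapter). The function names (fit_gmm, assign), the choice of window means as features, and the toy data are all illustrative assumptions.

```python
# Hypothetical sketch: cluster per-window features with a 2-component
# Gaussian mixture fitted by EM, then group windows by most likely
# component. The paper's actual method (Bayesian mixtures inside MixFT)
# is not reproduced here; this only shows the "re-divide by mixture" idea.
import numpy as np

def fit_gmm(x, k=2, iters=100):
    """Fit a k-component 1-D Gaussian mixture to x via EM."""
    # Deterministic init: spread initial means over the data quantiles.
    mu = np.quantile(x, np.linspace(0.25, 0.75, k))
    var = np.full(k, x.var())
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibilities r[n, j] proportional to pi_j * N(x_n | mu_j, var_j)
        d = x[:, None] - mu[None, :]
        logp = -0.5 * (d ** 2 / var + np.log(2 * np.pi * var)) + np.log(pi)
        logp -= logp.max(axis=1, keepdims=True)   # stabilise before exp
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, variances from responsibilities
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-6
    return pi, mu, var

def assign(x, pi, mu, var):
    """Hard-assign each point to its most responsible component."""
    d = x[:, None] - mu[None, :]
    logp = -0.5 * (d ** 2 / var + np.log(2 * np.pi * var)) + np.log(pi)
    return logp.argmax(axis=1)

# Toy corpus: one "dataset" whose window means actually come from two
# sub-domains (a distribution shift the dataset boundary hides).
rng = np.random.default_rng(1)
feats = np.concatenate([rng.normal(0.0, 0.5, 200), rng.normal(5.0, 0.5, 200)])
pi, mu, var = fit_gmm(feats)
labels = assign(feats, pi, mu, var)
# Each label group would then get its own fine-tuning module.
print(np.sort(mu), np.bincount(labels))
```

The point of the sketch: the mixture recovers the two sub-domains even though they sit inside a single dataset, which is exactly the situation where per-dataset modules blur distinct distributions together.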