[2602.20307] In-context Pre-trained Time-Series Foundation Models adapt to Unseen Tasks
Summary
This paper presents In-Context Time-series Pre-training (ICTP), a framework that equips time-series foundation models (TSFMs) with in-context learning capabilities so they can adapt to unseen tasks without fine-tuning, improving performance on those tasks by approximately 11.4%.
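The summary does not include reference code, but the core idea of restructuring pre-training data can be illustrated with a short sketch. The layout below, where several solved (input, output) demonstrations from one task precede a query window, and the function name `build_icl_episode` are assumptions for illustration, not the authors' implementation:

```python
import numpy as np

def build_icl_episode(series_pool, n_context=3, in_len=96, out_len=24, rng=None):
    """Regroup raw windows from one task's series into a single ICL episode:
    n_context solved (input, output) demonstrations plus a query window.
    Hypothetical layout; ICTP's actual restructuring may differ.
    Assumes every series is longer than in_len + out_len."""
    rng = rng or np.random.default_rng()
    demos = []
    for _ in range(n_context):
        s = series_pool[rng.integers(len(series_pool))]
        t = rng.integers(0, len(s) - in_len - out_len)
        # A demonstration pairs an input window with its ground-truth continuation.
        demos.append((s[t : t + in_len], s[t + in_len : t + in_len + out_len]))
    q = series_pool[rng.integers(len(series_pool))]
    t = rng.integers(0, len(q) - in_len - out_len)
    query_input = q[t : t + in_len]
    query_target = q[t + in_len : t + in_len + out_len]  # pre-training label
    return demos, query_input, query_target
```

Under this reading, the backbone TSFM is pre-trained to predict `query_target` given the whole episode, so task adaptation from context is learned during pre-training rather than added afterward.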
Why It Matters
The research addresses a significant limitation of existing time-series models, which often fail to generalize to new tasks. By integrating in-context learning, this approach improves the adaptability and utility of TSFMs across a variety of applications, making it a valuable contribution to the field of machine learning.
Key Takeaways
- In-Context Learning (ICL) enhances TSFMs for unseen tasks.
- The proposed ICTP framework restructures pre-training data for better adaptability.
- Experiments show an 11.4% performance improvement on unseen tasks.
- No fine-tuning is required, simplifying deployment in real-world scenarios.
- This approach broadens the applicability of time-series models across diverse datasets.
arXiv Details
Computer Science > Machine Learning, arXiv:2602.20307 (cs.LG), submitted on 23 Feb 2026
Title: In-context Pre-trained Time-Series Foundation Models adapt to Unseen Tasks
Authors: Shangqing Xu, Harshavardhan Kamarthi, Haoxin Liu, B. Aditya Prakash
Abstract: Time-series foundation models (TSFMs) have demonstrated strong generalization capabilities across diverse datasets and tasks. However, existing foundation models are typically pre-trained to enhance performance on specific tasks and often struggle to generalize to unseen tasks without fine-tuning. To address this limitation, we propose augmenting TSFMs with In-Context Learning (ICL) capabilities, enabling them to perform test-time inference by dynamically adapting to input-output relationships provided within the context. Our framework, In-Context Time-series Pre-training (ICTP), restructures the original pre-training data to equip the backbone TSFM with ICL capabilities, enabling adaptation to unseen tasks. Experiments demonstrate that ICTP improves the performance of state-of-the-art TSFMs by approximately 11.4% on unseen tasks without requiring fine-tuning.
Subjects: Machine Learning (cs.LG)
DOI: https://doi.org/10.48550/arXiv.2602.20307
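The abstract describes test-time inference that adapts to input-output relationships supplied in the context. A minimal sketch of how such inference might look is below; the flattening scheme and the `model.forecast` interface are assumptions, not a documented API:

```python
import numpy as np

def icl_forecast(model, demos, query_input):
    """Test-time adaptation without fine-tuning: pack the demonstration
    (input, output) pairs and the query into one context sequence and run a
    single forward pass. The concatenation layout and `model.forecast`
    interface are hypothetical, for illustration only."""
    context = np.concatenate([np.concatenate([x, y]) for x, y in demos])
    prompt = np.concatenate([context, query_input])
    # The model conditions on the solved demos to infer the unseen task.
    return model.forecast(prompt)
```

Because adaptation happens in a single forward pass rather than through gradient updates, deploying the model on a new task reduces to collecting a few demonstrations from that task.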