[2602.20307] In-context Pre-trained Time-Series Foundation Models adapt to Unseen Tasks

arXiv - Machine Learning

Summary

This paper presents In-Context Time-series Pre-training (ICTP), a framework that equips time-series foundation models (TSFMs) with in-context learning capabilities, allowing them to adapt to unseen tasks without fine-tuning and improving performance on those tasks by roughly 11.4%.

Why It Matters

The research addresses a key limitation of existing time-series foundation models: they are typically pre-trained for specific tasks and generalize poorly to tasks they have not seen. By integrating in-context learning, this approach lets a single pre-trained model adapt at inference time, broadening where TSFMs can be usefully deployed.

Key Takeaways

  • In-Context Learning (ICL) enhances TSFMs for unseen tasks.
  • The proposed ICTP framework restructures pre-training data for better adaptability (see the sketch after this list).
  • Experiments show an 11.4% performance improvement on unseen tasks.
  • No fine-tuning is required, simplifying deployment in real-world scenarios.
  • This approach broadens the applicability of time-series models across diverse datasets.
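
The summary does not spell out how ICTP restructures pre-training data, so the following Python sketch only illustrates the general idea of in-context pre-training; the function name, window lengths, and pairing scheme are all assumptions, not the paper's method. Each training sequence prepends a few solved input/output examples from related series before the target input, so the backbone can learn to read the context.

```python
import numpy as np

def build_icl_sequence(related_series, target_series, n_context=3,
                       input_len=96, output_len=24):
    """Hypothetical sketch: prepend (input, output) pairs drawn from
    related series as in-context examples before the target input.

    The actual ICTP restructuring is not given in this summary; this
    only shows the general shape of an in-context training sequence.
    """
    context = []
    for series in related_series[:n_context]:
        # Each context example is a solved input/output pair the model
        # can condition on.
        context.append(series[:input_len])                         # example input
        context.append(series[input_len:input_len + output_len])   # example output
    # The target contributes only its input; its continuation is the label.
    x = np.concatenate(context + [target_series[:input_len]])
    y = target_series[input_len:input_len + output_len]
    return x, y
```

During pre-training, the backbone TSFM would be optimized to predict `y` from `x`, so it learns to exploit the input-output relationships exposed in the prepended examples rather than relying on task-specific weights.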

Computer Science > Machine Learning
arXiv:2602.20307 (cs) [Submitted on 23 Feb 2026]

Title: In-context Pre-trained Time-Series Foundation Models adapt to Unseen Tasks
Authors: Shangqing Xu, Harshavardhan Kamarthi, Haoxin Liu, B. Aditya Prakash

Abstract: Time-series foundation models (TSFMs) have demonstrated strong generalization capabilities across diverse datasets and tasks. However, existing foundation models are typically pre-trained to enhance performance on specific tasks and often struggle to generalize to unseen tasks without fine-tuning. To address this limitation, we propose augmenting TSFMs with In-Context Learning (ICL) capabilities, enabling them to perform test-time inference by dynamically adapting to input-output relationships provided within the context. Our framework, In-Context Time-series Pre-training (ICTP), restructures the original pre-training data to equip the backbone TSFM with ICL capabilities, enabling adaptation to unseen tasks. Experiments demonstrate that ICTP improves the performance of state-of-the-art TSFMs by approximately 11.4% on unseen tasks without requiring fine-tuning.

Subjects: Machine Learning (cs.LG)
Cite as: arXiv:2602.20307 [cs.LG] (or arXiv:2602.20307v1 [cs.LG] for this version), https://doi.org/10.48550/arXiv.2602.20307
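
The abstract's claim of test-time adaptation without fine-tuning would then amount to reusing the same sequence construction with examples drawn from the unseen task. Continuing the hypothetical sketch above (the `tsfm` object and its `predict` call are placeholders, not an API from the paper):

```python
# Hypothetical test-time use: adapt to an unseen task by placing a few
# solved series from that task in the context; no gradient updates occur.
demo_series = [series_a, series_b, series_c]   # solved series from the unseen task
x_test, _ = build_icl_sequence(demo_series, new_series, n_context=3)
forecast = tsfm.predict(x_test)                # placeholder call to a pre-trained backbone
```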
