[2602.18679] Transformers for dynamical systems learn transfer operators in-context

Summary

This article explores how transformers can learn transfer operators for dynamical systems through in-context learning, enabling zero-shot forecasting of unseen systems.

Why It Matters

Understanding how large-scale foundation models adapt to new physical systems without retraining is crucial for advancing machine learning in scientific domains. This research probes the mechanism behind effective zero-shot forecasting of complex dynamical systems, with implications for fields such as climate modeling and robotics.

Key Takeaways

  • Transformers can forecast different dynamical systems without retraining.
  • In-context learning reveals a tradeoff between in-distribution and out-of-distribution performance.
  • Attention-based models utilize transfer-operator strategies for effective forecasting.
  • The study highlights the importance of global attractor information in short-term predictions.
  • This research challenges conventional learning paradigms in physical systems.
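The takeaways above mention that the model lifts low-dimensional time series via delay embedding. As an illustration only (the function name, parameters, and toy sine-wave system below are not from the paper), a Takens-style delay embedding can be sketched as:

```python
import numpy as np

def delay_embed(x, dim, lag=1):
    """Lift a scalar time series into dim-dimensional delay vectors:
    row t is [x[t], x[t+lag], ..., x[t+(dim-1)*lag]]."""
    n = len(x) - (dim - 1) * lag
    return np.stack([x[i * lag : i * lag + n] for i in range(dim)], axis=1)

# Toy example: a scalar observation of a 2-D rotation (a sine wave).
t = np.linspace(0, 20 * np.pi, 2000)
x = np.sin(t)
X = delay_embed(x, dim=3, lag=25)
print(X.shape)  # (1950, 3)
```

Plotting the columns of `X` against each other would reveal the circle underlying the scalar signal, which is the sense in which delay coordinates recover a higher-dimensional dynamical manifold.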

Computer Science > Machine Learning, arXiv:2602.18679 (cs)
Submitted on 21 Feb 2026

Title: Transformers for dynamical systems learn transfer operators in-context
Authors: Anthony Bao, Jeffrey Lai, William Gilpin

Abstract: Large-scale foundation models for scientific machine learning adapt to physical settings unseen during training, such as zero-shot transfer between turbulent scales. This phenomenon, in-context learning, challenges conventional understanding of learning and adaptation in physical systems. Here, we study in-context learning of dynamical systems in a minimal setting: we train a small two-layer, single-head transformer to forecast one dynamical system, and then evaluate its ability to forecast a different dynamical system without retraining. We discover an early tradeoff in training between in-distribution and out-of-distribution performance, which manifests as a secondary double descent phenomenon. We discover that attention-based models apply a transfer-operator forecasting strategy in-context. They (1) lift low-dimensional time series using delay embedding, to detect the system's higher-dimensional dynamical manifold, and (2) identify and forecast long-lived invariant sets that characterize the global flow on this manifold. Our results clarify the mechanism enabling large pretrained models to ...
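The abstract's "transfer-operator forecasting strategy" refers to approximating the operator that advances the system's state distribution forward in time. As a minimal sketch of that idea (not the paper's method; the least-squares DMD-style fit, function names, and toy rotation system below are illustrative assumptions), a finite-dimensional transfer operator can be estimated from pairs of successive states and then iterated to forecast:

```python
import numpy as np

def fit_transfer_operator(Y):
    """Least-squares fit of a linear operator K with Y[t+1] ~= K @ Y[t],
    a finite-dimensional, DMD-style approximation of the transfer operator."""
    A, B = Y[:-1], Y[1:]                      # pairs of successive states
    Kmat, *_ = np.linalg.lstsq(A, B, rcond=None)
    return Kmat.T                             # so that next ~= K @ current

def forecast(K, y0, steps):
    """Roll the fitted operator forward from the initial state y0."""
    ys = [y0]
    for _ in range(steps):
        ys.append(K @ ys[-1])
    return np.array(ys)

# Toy demo: states on a circle evolve by a fixed rotation, which is exactly
# linear, so the fitted operator recovers the rotation matrix.
theta = 0.1
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
Y = np.array([np.linalg.matrix_power(R, t) @ np.array([1.0, 0.0])
              for t in range(200)])
K = fit_transfer_operator(Y)
print(np.allclose(K, R, atol=1e-6))  # True
```

On a genuinely nonlinear system one would fit such an operator in a lifted space, e.g. on the delay coordinates from step (1) of the abstract, which is what makes a linear operator a useful description of the global flow.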
