[2602.15004] PDE foundation models are skillful AI weather emulators for the Martian atmosphere
Summary
This article shows that AI foundation models pretrained on numerical solutions to a diverse corpus of partial differential equations (PDEs) can be fine-tuned into skillful predictive weather emulators for the Martian atmosphere, with a substantial accuracy gain from combining pretraining with a 2D-to-3D model extension.
Why It Matters
Understanding Martian weather is crucial for future exploration and potential colonization. This research highlights how AI can enhance predictive modeling in environments with limited data, paving the way for more effective planning and resource management in space missions.
Key Takeaways
- AI foundation models can effectively emulate Martian weather patterns.
- Combining PDE pretraining with a 2D-to-3D model extension yielded a 34.4% performance increase on a held-out Martian year.
- The approach is beneficial for real-world problems with sparse data.
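The paper reports that the model was also evaluated under sparse initial conditions, though it does not spell out the masking scheme here. A minimal sketch of one common setup, assuming a random per-gridpoint observation mask (the function name, `keep_frac` parameter, and masking strategy are illustrative assumptions, not the paper's method):

```python
import numpy as np

def mask_initial_condition(field, keep_frac=0.1, seed=0):
    """Simulate sparse initial conditions: keep a random fraction of
    grid points and zero the rest, returning the masked field plus an
    observation mask.  This is a generic sketch, not the paper's
    documented masking scheme."""
    rng = np.random.default_rng(seed)
    mask = rng.random(field.shape) < keep_frac
    return np.where(mask, field, 0.0), mask.astype(field.dtype)

# Example: a 64x64 scalar field with ~10% of points observed.
field = np.random.default_rng(1).standard_normal((64, 64))
masked, mask = mask_initial_condition(field, keep_frac=0.1)
```

Feeding the mask alongside the masked field lets the emulator distinguish "observed as zero" from "unobserved", which matters for data-sparse settings like Mars.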
Computer Science > Machine Learning
arXiv:2602.15004 (cs) [Submitted on 16 Feb 2026]
Title: PDE foundation models are skillful AI weather emulators for the Martian atmosphere
Authors: Johannes Schmude, Sujit Roy, Liping Wang, Theodore van Kessel, Levente Klein, Marcus Freitag, Eloisa Bentivegna, Robert Manson-Sawko, Bjorn Lutjens, Manil Maskey, Campbell Watson, Rahul Ramachandran, Juan Bernabe-Moreno
Abstract: We show that AI foundation models that are pretrained on numerical solutions to a diverse corpus of partial differential equations can be adapted and fine-tuned to obtain skillful predictive weather emulators for the Martian atmosphere. We base our work on the Poseidon PDE foundation model for two-dimensional systems. We develop a method to extend Poseidon from two to three dimensions while keeping the pretraining information. Moreover, we investigate the performance of the model in the presence of sparse initial conditions. Our results make use of four Martian years (approx. 34 GB) of training data and a median compute budget of 13 GPU hours. We find that the combination of pretraining and model extension yields a performance increase of 34.4% on a held-out year. This shows that PDE foundation models can not only approximate solutions to (other) PDEs but also anchor models for real-world problems wi...
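The abstract mentions extending Poseidon from two to three dimensions while keeping the pretraining information, but does not describe the mechanism here. A minimal sketch of one standard way to carry 2D pretrained weights into a 3D operator, assuming kernel "inflation" (replicating a 2D convolution kernel along a new depth axis and rescaling); this is an illustrative assumption, not necessarily the paper's technique:

```python
import numpy as np

def inflate_2d_to_3d(w2d, depth=3):
    """Inflate a 2D conv kernel of shape (C_out, C_in, kH, kW) into a
    3D kernel (C_out, C_in, kD, kH, kW) by replicating along the new
    depth axis and dividing by depth, so an input that is constant
    along depth produces the same activation as the 2D kernel did."""
    w3d = np.repeat(w2d[:, :, None, :, :], depth, axis=2)
    return w3d / depth

# Check the preservation property on a depth-constant input patch.
rng = np.random.default_rng(0)
w2d = rng.standard_normal((1, 1, 3, 3))
w3d = inflate_2d_to_3d(w2d, depth=3)

patch2d = rng.standard_normal((3, 3))
patch3d = np.broadcast_to(patch2d, (3, 3, 3))  # constant along depth

resp2d = np.sum(w2d[0, 0] * patch2d)   # 2D response
resp3d = np.sum(w3d[0, 0] * patch3d)   # 3D response on inflated kernel
```

The rescaling is what "keeps the pretraining information": at initialization the 3D model reproduces the 2D model's behavior on vertically uniform fields, and fine-tuning then learns genuine vertical structure.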