[2604.09041] U-Cast: A Surprisingly Simple and Efficient Frontier Probabilistic AI Weather Forecaster
Computer Science > Machine Learning

arXiv:2604.09041 (cs) [Submitted on 10 Apr 2026]

Title: U-Cast: A Surprisingly Simple and Efficient Frontier Probabilistic AI Weather Forecaster
Authors: Salva Rühling Cachay, Duncan Watson-Parris, Rose Yu

Abstract: AI-based weather forecasting now rivals traditional physics-based ensembles, but state-of-the-art (SOTA) models rely on specialized architectures and massive computational budgets, creating a high barrier to entry. We demonstrate that such complexity is unnecessary for frontier performance. We introduce U-Cast, a probabilistic forecaster built on a standard U-Net backbone trained with a simple recipe: deterministic pre-training on Mean Absolute Error followed by short probabilistic fine-tuning on the Continuous Ranked Probability Score (CRPS), using Monte Carlo Dropout for stochasticity. As a result, our model matches or exceeds the probabilistic skill of GenCast and IFS ENS at 1.5$^\circ$ resolution while reducing training compute by over 10$\times$ compared to leading CRPS-based models and inference latency by over 10$\times$ compared to diffusion-based models. U-Cast trains in under 12 H200 GPU-days and generates a 60-step ensemble forecast in 11 seconds. These results suggest that scalable, general-purpose architectures paired with ...
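The fine-tuning objective in the abstract, the Continuous Ranked Probability Score, has a standard empirical estimator over ensemble members: $\mathrm{CRPS} = \mathbb{E}|X - y| - \tfrac{1}{2}\,\mathbb{E}|X - X'|$, where $X, X'$ are independent ensemble samples and $y$ is the observation. The paper itself does not publish its training code here; the sketch below is a generic NumPy illustration of this estimator, with the ensemble axis standing in for the Monte Carlo Dropout forward passes the abstract describes. The function name and shapes are assumptions, not the authors' API.

```python
import numpy as np

def crps_ensemble(samples, obs):
    """Empirical CRPS for a scalar observation from an ensemble.

    samples: 1-D array of ensemble member predictions (e.g. one forward
             pass per member with dropout left active, as in MC Dropout).
    obs:     the observed scalar value.

    Implements CRPS = E|X - y| - 0.5 * E|X - X'|, estimated with the
    plain (biased-at-small-ensemble) Monte Carlo form.
    """
    samples = np.asarray(samples, dtype=float)
    obs = float(obs)
    # Mean absolute error of members against the observation: E|X - y|.
    term1 = np.mean(np.abs(samples - obs))
    # Mean pairwise absolute spread between members: E|X - X'|.
    term2 = np.mean(np.abs(samples[:, None] - samples[None, :]))
    return term1 - 0.5 * term2
```

For a perfectly sharp, perfectly accurate ensemble (all members equal to the observation) the score is 0; a spread ensemble bracketing the truth, e.g. members `[0.0, 2.0]` against observation `1.0`, scores 0.5, trading accuracy against spread. In a training loop this scalar would be averaged over grid points and variables and minimized directly.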