A Space-Time Transformer for Precipitation Nowcasting
Levi Harris, Tianlong Chen
Code: github.com/leharris3/w4c-25 · github.com/leharris3/satformer
Abstract
Meteorological agencies around the world rely on real-time flood guidance to issue life-saving advisories and warnings. For decades, traditional numerical weather prediction (NWP) models have been state-of-the-art for precipitation forecasting. However, physically-parameterized models suffer from two core limitations: first, solving PDEs to resolve atmospheric dynamics is computationally demanding; second, these methods degrade in performance at nowcasting timescales (i.e., 0-4 hour lead times). Motivated by these shortcomings, recent work proposes AI-weather prediction (AI-WP) alternatives that learn to emulate analysis data with neural networks. While these data-driven approaches have enjoyed enormous success across diverse spatial and temporal resolutions, applications of video-understanding architectures to weather forecasting remain underexplored. To address these gaps, we propose SaTformer: a video transformer built on full space-time attention that skillfully forecasts extreme precipitation from satellite radiances. Along with our novel architecture, we introduce techniques to tame long-tailed precipitation datasets. Namely, we reformulate precipitation regression as a classification problem, and employ a class-weighted loss to address label imbalances. Our model won first place in the NeurIPS Weather4Cast 2025 "Cumulative Rainfall" challenge. Code and model weights are available at github.com/leharris3/satformer.
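"Full" space-time attention, as the abstract uses the term, means every spatio-temporal patch token in a video clip attends to every other token jointly across frames and spatial positions, rather than factorizing attention into separate temporal and spatial passes. A minimal single-head sketch (not the authors' implementation; shapes and the absence of learned projections are simplifications for illustration):

```python
import numpy as np

def softmax(z, axis=-1):
    # Numerically stable softmax
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def space_time_attention(x):
    # x: (T, H, W, d) grid of patch embeddings from a video clip.
    # Flatten all T*H*W patches into ONE token sequence, so each token
    # attends across both time and space in a single attention pass.
    t, h, w, d = x.shape
    q = k = v = x.reshape(t * h * w, d)   # single head, no projections (toy)
    scores = q @ k.T / np.sqrt(d)         # (T*H*W, T*H*W) joint attention map
    out = softmax(scores) @ v
    return out.reshape(t, h, w, d)

x = np.random.randn(4, 8, 8, 32)          # toy clip: 4 frames of 8x8 patches
y = space_time_attention(x)
print(y.shape)                            # (4, 8, 8, 32)
```

The cost of the joint attention map is quadratic in T·H·W, which is why factorized variants exist; the abstract's claim is that the full variant is worth that cost for this task.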
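The long-tail recipe in the abstract (regression recast as classification, plus a class-weighted loss) can be sketched as follows. The bin edges, toy rainfall values, and inverse-frequency weighting scheme below are illustrative assumptions, not the paper's actual configuration:

```python
import numpy as np

# Hypothetical rainfall bin edges in mm/h (illustrative, not from the paper).
# Continuous rainfall targets are discretized into classes, turning the
# regression problem into classification.
bin_edges = np.array([0.2, 1.0, 5.0, 20.0])

rain = np.array([0.0, 0.1, 0.3, 0.0, 7.5, 0.05, 2.0, 0.0])  # toy targets
labels = np.digitize(rain, bin_edges)          # class index 0..4 per pixel

# Inverse-frequency class weights: rare heavy-rain bins get larger weight
# so they are not drowned out by the dominant "no rain" class.
n_classes = len(bin_edges) + 1
counts = np.bincount(labels, minlength=n_classes)
weights = counts.sum() / np.maximum(counts, 1)
weights = weights / weights.mean()             # normalize around 1

def weighted_cross_entropy(probs, labels, weights):
    # probs: (N, C) predicted class probabilities; weights: (C,)
    picked = probs[np.arange(len(labels)), labels]
    return -np.mean(weights[labels] * np.log(picked))

probs = np.full((len(rain), n_classes), 1.0 / n_classes)  # uniform predictor
loss = weighted_cross_entropy(probs, labels, weights)
print(loss)
```

In a training framework this corresponds to passing per-class weights to a standard cross-entropy loss (e.g. the `weight` argument of PyTorch's `CrossEntropyLoss`).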