SOTAVerified

Lightweight, Pre-trained Transformers for Remote Sensing Timeseries

2023-04-27

Gabriel Tseng, Ruben Cartuyvels, Ivan Zvonkov, Mirali Purohit, David Rolnick, Hannah Kerner

Code Available

Abstract

Machine learning methods for satellite data have a range of societally relevant applications, but labels used to train models can be difficult or impossible to acquire. Self-supervision is a natural solution in settings with limited labeled data, but current self-supervised models for satellite data fail to take advantage of the characteristics of that data, including the temporal dimension (which is critical for many applications, such as monitoring crop growth) and availability of data from many complementary sensors (which can significantly improve a model's predictive performance). We present Presto (the Pretrained Remote Sensing Transformer), a model pre-trained on remote sensing pixel-timeseries data. By designing Presto specifically for remote sensing data, we can create a significantly smaller but performant model. Presto excels at a wide variety of globally distributed remote sensing tasks and performs competitively with much larger models while requiring far less compute. Presto can be used for transfer learning or as a feature extractor for simple models, enabling efficient deployment at scale.
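The abstract describes using the pre-trained model as a frozen feature extractor for simple downstream models on pixel-timeseries data. Below is a minimal sketch of that workflow. The encoder here is a hypothetical stand-in (a random per-timestep projection with mean pooling over time), not Presto's actual architecture, and the downstream model is a nearest-centroid classifier on synthetic data; all names and shapes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode_timeseries(x, proj):
    # x: (n_samples, n_timesteps, n_bands) pixel timeseries.
    # Stand-in "frozen encoder": linearly project each timestep,
    # then mean-pool over time to get one embedding per pixel.
    return (x @ proj).mean(axis=1)

n, t, bands, dim = 200, 12, 8, 32
proj = rng.normal(size=(bands, dim))  # hypothetical frozen encoder weights

# Synthetic binary task (e.g. crop vs. non-crop): class 1 pixels
# carry an additive signal across all bands and timesteps.
y = rng.integers(0, 2, size=n)
x = rng.normal(size=(n, t, bands)) + y[:, None, None] * 1.5

emb = encode_timeseries(x, proj)

# Simple downstream model on the embeddings: nearest class centroid.
centroids = np.stack([emb[y == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(((emb[:, None, :] - centroids[None]) ** 2).sum(-1), axis=1)
accuracy = (pred == y).mean()
```

The point of this pattern, as the abstract notes, is deployment efficiency: the expensive encoder runs once per pixel, and the downstream model stays cheap enough to fit and apply at scale.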

Tasks

Benchmark Results

| Dataset | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| CropHarvest - Brazil | PrestoR | Target Binary F1 | 0.89 | — | Unverified |
| CropHarvest - Kenya | PrestoR - no DW | Target Binary F1 | 0.86 | — | Unverified |
| CropHarvest - Togo | PrestoR | Target Binary F1 | 0.80 | — | Unverified |

Reproductions