Effective Benchmarks for Optical Turbulence Modeling
Christopher Jellen, Charles Nelson, Cody Brownell, John Burkhardt
Code: https://github.com/cdjellen/otbench (official)
Abstract
Optical turbulence presents a significant challenge for communication, directed energy, and imaging systems, especially in the atmospheric boundary layer. Effective modeling of optical turbulence strength is critical for the development and deployment of these systems. The lack of standard evaluation tools, including long-term data sets, modeling tasks, metrics, and baseline models, prevents effective comparison between approaches, makes results harder to reproduce, and contributes to over-fitting on local micro-climates. Performance characterized using evaluation metrics provides some insight into the applicability of a model for predicting the strength of optical turbulence; however, these metrics alone are not sufficient for understanding the relative quality of a model. We introduce otbench, a Python package for the rigorous development and evaluation of optical turbulence strength prediction models. The package provides a consistent interface for evaluating optical turbulence models on a variety of benchmark tasks and data sets, and includes a range of baseline models (statistical, data-driven, and deep learning) to give a sense of relative model quality. otbench also supports adding new data sets, tasks, and evaluation metrics. The package is available at https://github.com/cdjellen/otbench.
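To make the role of the baseline models concrete, the sketch below evaluates two of the simplest baselines named in the benchmark table, persistence and climatology, under RMSE on a synthetic log10(Cn2)-like series. This is an illustrative sketch only, not the otbench API: the synthetic data, split point, and function names are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic log10(Cn2)-like series: a diurnal cycle plus noise
# (values and amplitudes are illustrative, not measured data).
t = np.arange(1440)  # one day of minute-resolution samples
log_cn2 = -14.5 + 0.8 * np.sin(2 * np.pi * t / 1440) + 0.1 * rng.standard_normal(t.size)

# Simple chronological train/test split.
train, test = log_cn2[:1000], log_cn2[1000:]

def rmse(pred, truth):
    """Root-mean-square error between predictions and observations."""
    return float(np.sqrt(np.mean((pred - truth) ** 2)))

# Persistence baseline: predict each value as the previous observation.
persistence_pred = np.concatenate(([train[-1]], test[:-1]))

# Climatology baseline: predict the training-set mean everywhere.
climatology_pred = np.full_like(test, train.mean())

print("persistence RMSE:", rmse(persistence_pred, test))
print("climatology RMSE:", rmse(climatology_pred, test))
```

Comparing a candidate model's RMSE against such trivial baselines is what gives the metric meaning: a model that cannot beat persistence on minute-resolution data adds little value, regardless of its absolute score.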
Benchmark Results
| Dataset | Task | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|---|
| MLO-Cn2 | Regression | Climatology | RMSE | 0.66 | — | Unverified |
| MLO-Cn2 | Regression | GBRT | RMSE | 0.21 | — | Unverified |
| MLO-Cn2 | Regression | Minute Climatology | RMSE | 0.5 | — | Unverified |
| MLO-Cn2 | Regression | Persistence | RMSE | 1.21 | — | Unverified |
| MLO-Cn2 | Regression | RNN | RMSE | 0.34 | — | Unverified |
| MLO-Cn2 | Forecasting | GBRT | RMSE | 0.43 | — | Unverified |
| MLO-Cn2 | Forecasting | Mean Window Forecast | RMSE | 0.48 | — | Unverified |
| MLO-Cn2 | Forecasting | Minute Climatology | RMSE | 0.55 | — | Unverified |
| MLO-Cn2 | Forecasting | RNN | RMSE | 0.58 | — | Unverified |
| MLO-Cn2 | Forecasting | Linear Forecast | RMSE | 0.93 | — | Unverified |
| MLO-Cn2 | Forecasting | Persistence | RMSE | 1.23 | — | Unverified |
| USNA-Cn2 (short-duration) | Regression | Persistence | RMSE | 0.82 | — | Unverified |
| USNA-Cn2 (short-duration) | Regression | GBRT | RMSE | 0.16 | — | Unverified |
| USNA-Cn2 (short-duration) | Regression | Minute Climatology | RMSE | 0.45 | — | Unverified |
| USNA-Cn2 (short-duration) | Forecasting | Persistence | RMSE | 0.76 | — | Unverified |
| USNA-Cn2 (short-duration) | Forecasting | RNN | RMSE | 0.19 | — | Unverified |
| USNA-Cn2 (short-duration) | Forecasting | Mean Window Forecast | RMSE | 0.18 | — | Unverified |
| USNA-Cn2 (short-duration) | Forecasting | Minute Climatology | RMSE | 0.45 | — | Unverified |