Provably Calibrated Regression Under Distribution Drift

2021-09-29

Shengjia Zhao, Yusuke Tashiro, Danny Tse, Stefano Ermon

Abstract

Accurate uncertainty quantification is a key building block of trustworthy machine learning systems. Uncertainty is typically represented by probability distributions over the possible outcomes, and these probabilities should be calibrated, e.g. the 90% credible interval should contain the true outcome 90% of the time. In the online prediction setting, existing conformal methods can provably achieve calibration assuming no distribution shift; however, this assumption is difficult to verify, and is unlikely to hold in many applications such as time series prediction. Inspired by control theory, we propose a prediction algorithm that guarantees calibration even under distribution shift, and achieves strong performance on metrics such as sharpness and proper scores. We compare our method with baselines on 19 time-series and regression datasets, and our method achieves approximately a 2x reduction in calibration error, comparable sharpness, and improved downstream decision utility.
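To illustrate the general idea of control-theoretic online calibration, the sketch below implements a simple proportional-feedback update in the spirit of adaptive conformal inference: after each observation, the working miscoverage level is nudged up or down depending on whether the predicted interval covered the outcome. This is an illustrative assumption for exposition only, not the paper's actual algorithm; the function name, learning rate, and score definition are all hypothetical.

```python
import numpy as np

def online_coverage_tracking(scores, target=0.9, lr=0.05):
    """Illustrative proportional-feedback calibration sketch (NOT the
    paper's method). `scores` is a stream of nonconformity scores,
    e.g. s_t = |y_t - f(x_t)|. At each step we form a threshold q_t
    from past scores, then adjust the working miscoverage `alpha`
    based on whether the interval covered the new outcome.
    Returns the empirical coverage over the stream."""
    alpha = 1.0 - target       # working miscoverage level
    history = []               # past nonconformity scores
    covered = []
    for s in scores:
        if history:
            # threshold = empirical (1 - alpha) quantile of past scores,
            # with the quantile level clamped to [0, 1]
            level = min(max(1.0 - alpha, 0.0), 1.0)
            q = np.quantile(history, level)
        else:
            q = float("inf")   # no data yet: trivially cover
        hit = s <= q
        covered.append(hit)
        # feedback step: a miss shrinks alpha (widening future
        # intervals); a hit grows alpha (tightening them)
        err = 0.0 if hit else 1.0
        alpha += lr * ((1.0 - target) - err)
        history.append(s)
    return float(np.mean(covered))
```

Even when the score distribution drifts over time, the feedback term keeps the long-run empirical coverage near the target, which is the flavor of guarantee the abstract describes.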
