
Monotonic Risk Relationships under Distribution Shifts for Regularized Risk Minimization

2022-10-20

Daniel LeJeune, Jiayu Liu, Reinhard Heckel


Abstract

Machine learning systems are often applied to data drawn from a different distribution than the training distribution. Recent work has shown that for a variety of classification and signal reconstruction problems, out-of-distribution performance is strongly linearly correlated with in-distribution performance. If this relationship, or more generally a monotonic one, holds, it has important consequences: for example, it allows one to optimize performance on one distribution as a proxy for performance on the other. In this paper, we study conditions under which a monotonic relationship between the performances of a model on two distributions is expected. We prove an exact asymptotic linear relation for squared error and a monotonic relation for misclassification error for ridge-regularized general linear models under covariate shift, as well as an approximate linear relation for linear inverse problems.
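The kind of relationship the abstract describes can be illustrated with a small simulation. The sketch below is a hypothetical setup, not the paper's construction: it trains ridge regression on isotropic Gaussian covariates, evaluates squared error on an in-distribution test set and on a covariate-shifted one (different per-feature variances), and sweeps the regularization strength. The specific dimensions, noise level, and shift are illustrative assumptions; the point is that the two errors tend to move together across the regularization sweep.

```python
import numpy as np

rng = np.random.default_rng(0)

d, n = 50, 200
# ground-truth linear model (illustrative assumption, not from the paper)
beta = rng.normal(size=d) / np.sqrt(d)

def sample(n, scale):
    # covariates with diagonal covariance given by per-feature scales
    X = rng.normal(size=(n, d)) * scale
    y = X @ beta + 0.1 * rng.normal(size=n)
    return X, y

scale_id = np.ones(d)                    # in-distribution: isotropic
scale_ood = np.linspace(0.5, 2.0, d)     # covariate shift: anisotropic variances

Xtr, ytr = sample(n, scale_id)
Xid, yid = sample(2000, scale_id)
Xood, yood = sample(2000, scale_ood)

def ridge(X, y, lam):
    # closed-form ridge estimator: (X^T X + lam I)^{-1} X^T y
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

errs = []
for lam in np.logspace(-2, 3, 20):
    b = ridge(Xtr, ytr, lam)
    e_id = np.mean((Xid @ b - yid) ** 2)    # in-distribution squared error
    e_ood = np.mean((Xood @ b - yood) ** 2) # out-of-distribution squared error
    errs.append((e_id, e_ood))
errs = np.array(errs)

# correlation between ID and OOD error across the regularization sweep
print(np.corrcoef(errs[:, 0], errs[:, 1])[0, 1])
```

In this toy setting the printed correlation is typically close to 1, consistent with the linear ID–OOD relationship the paper proves asymptotically; the exact value depends on the assumed shift and noise level.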
