
Robust Distribution Learning with Local and Global Adversarial Corruptions

2024-06-10

Sloan Nietert, Ziv Goldfeld, Soroosh Shafiee

Abstract

We consider learning in an adversarial environment, where an $\varepsilon$-fraction of samples from a distribution $P$ are arbitrarily modified (global corruptions) and the remaining perturbations have average magnitude bounded by $\rho$ (local corruptions). Given access to $n$ such corrupted samples, we seek a computationally efficient estimator $\hat{P}_n$ that minimizes the Wasserstein distance $W_1(\hat{P}_n, P)$. In fact, we attack the fine-grained task of minimizing $W_1(\Pi_\# \hat{P}_n, \Pi_\# P)$ for all orthogonal projections $\Pi \in \mathbb{R}^{d \times d}$, with performance scaling with $\mathrm{rank}(\Pi) = k$. This allows us to account simultaneously for mean estimation ($k = 1$), distribution estimation ($k = d$), as well as the settings interpolating between these two extremes. We characterize the optimal population-limit risk for this task and then develop an efficient finite-sample algorithm with error bounded by $\sqrt{\varepsilon k} + \rho + \tilde{O}(d\sqrt{k}\, n^{-1/(k \lor 2)})$ when $P$ has bounded covariance. This guarantee holds uniformly in $k$ and is minimax optimal up to the sub-optimality of the plug-in estimator when $\varepsilon = \rho = 0$. Our efficient procedure relies on a novel trace norm approximation of an ideal yet intractable 2-Wasserstein projection estimator. We apply this algorithm to robust stochastic optimization, and, in the process, uncover a new method for overcoming the curse of dimensionality in Wasserstein distributionally robust optimization.
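
The corruption model described in the abstract is easy to simulate in one dimension. The sketch below is a hand-rolled illustration, not the paper's estimator or algorithm: it plants an $\varepsilon$-fraction of arbitrary outliers (global corruptions), adds perturbations of bounded average magnitude to the remaining samples (local corruptions), and reports the resulting 1-Wasserstein gap between the clean and corrupted empirical samples, i.e., the $k = 1$ case of the projected objective. All parameter values and the outlier placement are arbitrary choices made for this example.

```python
# Illustrative sketch only (not the paper's method): simulate the corruption
# model and measure its effect on the empirical W_1 distance in 1-D.
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
n, eps, rho = 5_000, 0.05, 0.1          # sample size and corruption levels (arbitrary)

clean = rng.normal(loc=0.0, scale=1.0, size=n)   # i.i.d. samples from P

# Local corruptions: per-sample shifts whose average magnitude is at most rho.
local_noise = rng.uniform(-rho, rho, size=n)
corrupted = clean + local_noise

# Global corruptions: an eps-fraction of samples is replaced arbitrarily.
m = int(eps * n)
outlier_idx = rng.choice(n, size=m, replace=False)
corrupted[outlier_idx] = 50.0                    # adversarial placement (arbitrary)

# 1-Wasserstein distance between clean and corrupted empirical measures.
print("W1(clean, corrupted) =", wasserstein_distance(clean, corrupted))
```

A robust estimator in the sense of the abstract would recover a distribution whose $W_1$ distance to the clean samples scales with $\sqrt{\varepsilon} + \rho$ rather than with the magnitude of the planted outliers, which is what the plain empirical distribution above fails to do.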
