
Differentially Private Nonparametric Regression Under a Growth Condition

2021-11-24

Noah Golowich


Abstract

Given a real-valued hypothesis class H, we investigate under what conditions there is a differentially private algorithm which learns an optimal hypothesis from H given i.i.d. data. Inspired by recent results for the related setting of binary classification (Alon et al., 2019; Bun et al., 2020), where it was shown that online learnability of a binary class is necessary and sufficient for its private learnability, Jung et al. (2020) showed that in the setting of regression, online learnability of H is necessary for private learnability. Here online learnability of H is characterized by the finiteness of its η-sequential fat shattering dimension, sfat_η(H), for all η > 0. In terms of sufficient conditions for private learnability, Jung et al. (2020) showed that H is privately learnable if lim_{η↓0} sfat_η(H) is finite, which is a fairly restrictive condition. We show that under the relaxed condition lim_{η↓0} η · sfat_η(H) = 0, H is privately learnable, establishing the first nonparametric private learnability guarantee for classes H with sfat_η(H) diverging as η ↓ 0. Our techniques involve a novel filtering procedure to output stable hypotheses for nonparametric function classes.
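To make the gap between the two sufficient conditions concrete, the following is an illustrative sketch (the logarithmic growth rate is a hypothetical example, not taken from the abstract): a class whose sequential fat-shattering dimension diverges slowly can violate the stronger condition of Jung et al. (2020) while still satisfying the relaxed growth condition.

```latex
% Jung et al. (2020): sufficient condition (restrictive)
\lim_{\eta \downarrow 0} \mathrm{sfat}_\eta(H) < \infty

% This paper: relaxed growth condition
\lim_{\eta \downarrow 0} \eta \cdot \mathrm{sfat}_\eta(H) = 0

% Illustrative example (hypothetical growth rate): if
%   \mathrm{sfat}_\eta(H) = \Theta\bigl(\log(1/\eta)\bigr),
% then the first limit diverges as \eta \downarrow 0, yet
%   \eta \log(1/\eta) \to 0,
% so the relaxed condition holds and private learnability follows.
```

Any class satisfying the finite-limit condition trivially satisfies the growth condition, since a bounded sfat_η(H) multiplied by η → 0 gives a vanishing product; the converse fails, as the sketch above shows.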
