
Dirichlet Mechanism for Differentially Private KL Divergence Minimization

2021-10-03 · NeurIPS 2021 · Code Available

Donlapark Ponnoprat


Abstract

Given an empirical distribution f(x) of sensitive data x, we consider the task of minimizing F(y) = D_KL(f(x) ‖ y) over a probability simplex, while protecting the privacy of x. We observe that, if we take the exponential mechanism and use the KL divergence as the loss function, then the resulting algorithm is the Dirichlet mechanism that outputs a single draw from a Dirichlet distribution. Motivated by this, we propose a Rényi differentially private (RDP) algorithm that employs the Dirichlet mechanism to solve the KL divergence minimization task. In addition, given f(x) as above and y an output of the Dirichlet mechanism, we prove a probability tail bound on D_KL(f(x) ‖ y), which is then used to derive a lower bound for the sample complexity of our RDP algorithm. Experiments on real-world datasets demonstrate advantages of our algorithm over Gaussian and Laplace mechanisms in supervised classification and maximum likelihood estimation.
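The core idea can be illustrated with a minimal sketch: release a single draw from a Dirichlet distribution whose concentration parameters are proportional to the empirical distribution, then measure the utility loss as D_KL(f ‖ y). The scaling of the concentration parameters by a constant `scale` is an assumption made here for illustration; the paper's actual parameterization and its calibration to an RDP budget may differ.

```python
import numpy as np
from scipy.special import rel_entr

rng = np.random.default_rng(0)

def dirichlet_mechanism(f, scale):
    """Illustrative Dirichlet mechanism: one draw from
    Dirichlet(scale * f). The mapping from a privacy budget to
    `scale` is not shown here and is paper-specific."""
    return rng.dirichlet(scale * f)

# Empirical distribution of the sensitive data (a point on the simplex).
f = np.array([0.5, 0.3, 0.2])

# Privatized output: a nearby point on the simplex.
y = dirichlet_mechanism(f, scale=100.0)

# Utility loss D_KL(f || y); smaller is better, larger `scale`
# concentrates the draw around f at the cost of weaker privacy.
kl = rel_entr(f, y).sum()
```

Note the privacy/utility trade-off is controlled entirely by the concentration: a larger `scale` makes the Dirichlet draw concentrate around f, reducing the KL loss but revealing more about the data.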
