
Differentially Private Selection using Smooth Sensitivity

2025-04-10

Iago Chaves, Victor Farias, Amanda Perez, Diego Parente, Javam Machado


Abstract

Differentially private selection mechanisms offer strong privacy guarantees for queries aiming to identify the top-scoring element r from a finite set R, based on a dataset-dependent utility function. While selection queries are fundamental in data science, few mechanisms effectively ensure their privacy. Furthermore, most approaches rely on global sensitivity to achieve differential privacy (DP), which can introduce excessive noise and impair downstream inferences. To address this limitation, we propose the Smooth Noisy Max (SNM) mechanism, which leverages smooth sensitivity to yield provably tighter (upper bounds on) expected errors compared to global sensitivity-based methods. Empirical results demonstrate that SNM is more accurate than state-of-the-art differentially private selection methods in three applications: percentile selection, greedy decision trees, and random forests.
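To make the selection setting concrete, here is a minimal sketch of a standard global-sensitivity baseline that SNM is compared against: Report Noisy Max, which perturbs each candidate's utility score with Laplace noise and returns the argmax. This is a generic illustration, not the paper's SNM mechanism; computing smooth sensitivity is dataset-dependent and beyond this sketch.

```python
import numpy as np

def report_noisy_max(utilities, sensitivity, epsilon, rng=None):
    """Generic Report Noisy Max baseline (global sensitivity).

    Adds Laplace noise with scale 2 * sensitivity / epsilon to each
    candidate's utility score and returns the index of the noisy maximum.
    """
    rng = np.random.default_rng() if rng is None else rng
    scores = np.asarray(utilities, dtype=float)
    noise = rng.laplace(scale=2.0 * sensitivity / epsilon, size=scores.shape)
    return int(np.argmax(scores + noise))

# Hypothetical usage: pick the best of four candidates.
# With a large privacy budget the noise is small, so the true
# maximizer (index 0) is selected with high probability.
scores = [10.0, 3.0, 7.5, 9.8]
idx = report_noisy_max(scores, sensitivity=1.0, epsilon=1000.0)
```

SNM's contribution, per the abstract, is to replace the global-sensitivity noise scale above with one based on smooth sensitivity, which adapts to the actual dataset and yields provably tighter upper bounds on expected error.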
