SOTAVerified

S-BDT: Distributed Differentially Private Boosted Decision Trees

2023-09-21 · Code Available

Thorsten Peinemann, Moritz Kirschte, Joshua Stock, Carlos Cotrini, Esfandiar Mohammadi


Abstract

We introduce S-BDT: a novel (ε,δ)-differentially private distributed gradient boosted decision tree (GBDT) learner that improves the protection of single training data points (privacy) while achieving meaningful learning goals, such as accuracy or regression error (utility). S-BDT uses less noise by relying on non-spherical multivariate Gaussian noise, for which we show tight subsampling bounds for privacy amplification and which we incorporate into a Rényi filter for individual privacy accounting. We experimentally reach the same utility while saving 50% in terms of epsilon for ε ≤ 0.5 on the Abalone regression dataset (dataset size ≈ 4K), saving 30% in terms of epsilon for ε ≤ 0.08 on the Adult classification dataset (dataset size ≈ 50K), and saving 30% in terms of epsilon for ε ≤ 0.03 on the Spambase classification dataset (dataset size ≈ 5K). Moreover, we show that for situations where a GBDT is learning a stream of data that originates from different subpopulations (non-IID), S-BDT saves even more in terms of epsilon.
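The key idea of non-spherical Gaussian noise can be illustrated with a minimal sketch: instead of adding noise with one shared scale to every released statistic (spherical noise), each coordinate gets its own scale, e.g. calibrated to its own sensitivity. This is only an illustrative toy, not the paper's actual calibration or accounting; the function name and the example scales are hypothetical.

```python
import numpy as np

# Illustrative sketch only: S-BDT's actual noise calibration, subsampling
# bounds, and Renyi filter are in the paper. Here we just show the shape of
# "non-spherical" (anisotropic) Gaussian noise: a diagonal covariance with
# a separate, hypothetical scale per coordinate.
rng = np.random.default_rng(0)

def add_anisotropic_gaussian_noise(stats, per_coord_scale):
    """Add zero-mean Gaussian noise where each coordinate of `stats`
    receives its own noise scale (diagonal, non-spherical covariance)."""
    stats = np.asarray(stats, dtype=float)
    scale = np.asarray(per_coord_scale, dtype=float)
    assert stats.shape == scale.shape, "one scale per coordinate"
    return stats + rng.normal(loc=0.0, scale=scale, size=stats.shape)

# Hypothetical example: a leaf's gradient sum and hessian sum, noised
# with different scales rather than one shared (spherical) scale.
noisy = add_anisotropic_gaussian_noise([12.3, 4.1], [2.0, 0.5])
```

Allocating less noise to low-sensitivity coordinates is what lets an anisotropic mechanism spend a given privacy budget more efficiently than a single spherical scale.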
