
Decentralized learning with budgeted network load using Gaussian copulas and classifier ensembles

2018-04-26

John Klein, Mahmoud Albardan, Benjamin Guedj, Olivier Colot


Abstract

We examine a network of learners that address the same classification task but must learn from different data sets. The learners cannot share data; instead they share their models. Models are shared only once, to limit network load. We introduce DELCO (standing for Decentralized Ensemble Learning with COpulas), a new approach for aggregating the predictions of the classifiers trained by each learner. The proposed method aggregates the base classifiers using a probabilistic model relying on Gaussian copulas. Experiments on logistic regression ensembles demonstrate competitive accuracy and increased robustness when base classifiers are dependent. A companion Python implementation can be downloaded at https://github.com/john-klein/DELCO
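To illustrate the kind of aggregation the abstract describes, the sketch below combines the class-probability outputs of several base classifiers through a Gaussian copula density, which captures the dependence among classifiers. This is a minimal illustration, not the paper's exact estimator: the function names, the clipping constant, and the specific aggregation rule (product of marginal probabilities times the copula density, renormalized over classes) are assumptions here; see the DELCO repository for the actual method.

```python
import numpy as np
from scipy.stats import norm

def gaussian_copula_density(u, R):
    """Density of a Gaussian copula with correlation matrix R,
    evaluated at a vector u of values in (0, 1).
    With R = I this reduces to 1 (the independence copula)."""
    z = norm.ppf(u)                       # map to standard-normal scores
    R_inv = np.linalg.inv(R)
    quad = z @ (R_inv - np.eye(len(u))) @ z
    return np.exp(-0.5 * quad) / np.sqrt(np.linalg.det(R))

def aggregate(probas, R_per_class):
    """Illustrative decentralized aggregation (hypothetical helper).

    probas       : array (n_classifiers, n_classes); each row is one base
                   classifier's predicted class distribution.
    R_per_class  : one correlation matrix per class, modeling the
                   dependence among classifiers' outputs for that class.
    Returns a normalized score vector over classes: the product of the
    base probabilities, reweighted by the copula density tying them
    together, then renormalized.
    """
    n_clf, n_classes = probas.shape
    scores = np.empty(n_classes)
    for k in range(n_classes):
        u = np.clip(probas[:, k], 1e-6, 1 - 1e-6)   # keep norm.ppf finite
        scores[k] = np.prod(u) * gaussian_copula_density(u, R_per_class[k])
    return scores / scores.sum()
```

When every `R_per_class[k]` is the identity, the copula density equals 1 and the rule collapses to the classic independent product combination; non-trivial correlation matrices are what let the ensemble discount redundant, dependent classifiers.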
