Online Asynchronous Distributed Regression
2014-07-16
Gérard Biau, Ryad Zenine
Code Available
- github.com/ryadzenine/dolphin (official)
Abstract
Distributed computing offers a high degree of flexibility to accommodate modern learning constraints and the ever-increasing size of datasets involved in massive-data problems. Drawing inspiration from the theory of distributed computation models developed in the context of gradient-type optimization algorithms, we present a consensus-based asynchronous distributed approach for nonparametric online regression and analyze some of its asymptotic properties. Substantial numerical evidence involving up to 28 parallel processors is provided on synthetic datasets to assess the excellent performance of our method, both in terms of computation time and prediction accuracy.
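To make the idea concrete, here is a minimal single-process sketch of the general scheme the abstract describes: several workers each run an online (stochastic-gradient) regression update on their own sample stream, and pairs of workers occasionally average their estimates, a crude stand-in for the asynchronous consensus step. All names, the linear model, and the gossip schedule are illustrative assumptions; the paper itself treats the nonparametric setting.

```python
import random

random.seed(0)

# Hypothetical setup: each of 4 workers fits y = w * x online from its own
# noisy sample stream; the true slope is TRUE_W.
TRUE_W = 2.0
N_WORKERS = 4
w = [0.0] * N_WORKERS  # one local estimate per worker

for t in range(1, 5001):
    step = 2.0 / t  # decreasing gain, as in stochastic approximation
    for k in range(N_WORKERS):
        x = random.uniform(-1.0, 1.0)
        y = TRUE_W * x + random.gauss(0.0, 0.1)
        # Local online least-squares gradient step on the fresh sample.
        w[k] -= step * (w[k] * x - y) * x
    # Consensus step: a randomly chosen pair of workers averages its
    # estimates, mimicking asynchronous pairwise gossip.
    i, j = random.sample(range(N_WORKERS), 2)
    w[i] = w[j] = 0.5 * (w[i] + w[j])

print([round(v, 2) for v in w])
```

After enough rounds all local estimates cluster near the true slope and near one another, which is the qualitative behavior (consensus plus consistency) that the paper's asymptotic analysis makes precise.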