
Efficient Distributed Learning with Sparsity

2016-05-25 · ICML 2017

Jialei Wang, Mladen Kolar, Nathan Srebro, Tong Zhang


Abstract

We propose a novel, efficient approach for distributed sparse learning in high dimensions, where observations are randomly partitioned across machines. Computationally, at each round our method only requires the master machine to solve a shifted ℓ1-regularized M-estimation problem, while the other workers compute gradients. In terms of communication, the proposed approach provably matches the estimation error bound of centralized methods within a constant number of communication rounds (ignoring logarithmic factors). We conduct extensive experiments on both simulated and real-world datasets, and demonstrate encouraging performance on high-dimensional regression and classification tasks.
