
Asynchronous stochastic convex optimization

2015-08-04

John C. Duchi, Sorathan Chaturapruek, Christopher Ré


Abstract

We show that asymptotically, completely asynchronous stochastic gradient procedures achieve optimal (even to constant factors) convergence rates for the solution of convex optimization problems under nearly the same conditions required for asymptotic optimality of standard stochastic gradient procedures. Roughly, the noise inherent to the stochastic approximation scheme dominates any noise from asynchrony. We also give empirical evidence demonstrating the strong performance of asynchronous, parallel stochastic optimization schemes, showing that the robustness inherent to stochastic approximation problems allows substantially faster parallel and asynchronous solution methods.
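To make the setting concrete, the kind of procedure the abstract describes can be sketched as follows: several workers share one iterate and apply stochastic gradient updates to it without any locking, so each worker may read a slightly stale iterate. This is a minimal, hypothetical illustration (the function and parameter names `async_sgd`, `n_workers`, `step0` are mine, not the paper's code), using Python threads and a simple noisy quadratic objective.

```python
import threading
import numpy as np

def async_sgd(grad, x0, n_workers=4, n_steps=3000, step0=0.2):
    """Lock-free asynchronous SGD sketch (illustrative only).

    All workers read and write the shared iterate `x` with no
    synchronization; updates may interleave or be lost, which is
    exactly the asynchrony noise the abstract says is dominated
    by the stochastic-gradient noise itself.
    """
    x = np.array(x0, dtype=float)  # shared iterate, updated in place

    def worker(seed):
        rng = np.random.default_rng(seed)  # per-worker noise stream
        for t in range(n_steps):
            g = grad(x, rng)               # gradient at (possibly stale) x
            x[:] -= (step0 / np.sqrt(t + 1)) * g  # unsynchronized update

    threads = [threading.Thread(target=worker, args=(s,))
               for s in range(n_workers)]
    for th in threads:
        th.start()
    for th in threads:
        th.join()
    return x

if __name__ == "__main__":
    # Noisy gradient of f(x) = 0.5 * ||x - x_star||^2.
    x_star = np.array([1.0, -2.0, 3.0])

    def noisy_grad(x, rng):
        return (x - x_star) + 0.1 * rng.standard_normal(x_star.shape)

    x_hat = async_sgd(noisy_grad, np.zeros(3))
    print(np.linalg.norm(x_hat - x_star))
```

Despite the races on `x`, the decaying step size drives the iterate toward the minimizer; a lost or stale update behaves like one more source of bounded noise.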
