Accelerated Proximal Stochastic Dual Coordinate Ascent for Regularized Loss Minimization
2013-09-10
Shai Shalev-Shwartz, Tong Zhang
Abstract
We introduce a proximal version of the stochastic dual coordinate ascent method and show how to accelerate the method using an inner-outer iteration procedure. We analyze the runtime of the framework and obtain rates that improve state-of-the-art results for various key machine learning optimization problems including SVM, logistic regression, ridge regression, Lasso, and multiclass SVM. Experiments validate our theoretical findings.
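To make the framework concrete, here is a minimal sketch of proximal SDCA for one instance the abstract covers: squared loss with an elastic-net-style regularizer λ(½‖w‖² + σ‖w‖₁), where the primal point is recovered from the dual average by soft-thresholding. The function name prox_sdca, its parameters, and the closed-form coordinate step (borrowed from the ridge-regression SDCA update) are illustrative assumptions, not the authors' reference implementation; the accelerated inner-outer loop is omitted.

```python
import numpy as np

def prox_sdca(X, y, lam=0.1, sigma=0.1, n_epochs=50, seed=0):
    """Hypothetical sketch of proximal SDCA for squared loss with
    regularizer lam * (0.5*||w||^2 + sigma*||w||_1)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    alpha = np.zeros(n)        # dual variables, one per training example
    v = np.zeros(d)            # dual average v = (1/(lam*n)) * sum_i alpha_i x_i

    def soft(z, t):
        # soft-thresholding = gradient of g* for g(w) = 0.5*||w||^2 + t*||w||_1
        return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

    w = soft(v, sigma)         # primal iterate maintained from the dual average
    for _ in range(n_epochs):
        for i in rng.permutation(n):
            x_i = X[i]
            # closed-form dual coordinate step for the 1-smooth squared loss,
            # borrowed from the ridge-regression SDCA update (an assumption here)
            delta = (y[i] - x_i @ w - alpha[i]) / (1.0 + (x_i @ x_i) / (lam * n))
            alpha[i] += delta
            v += (delta / (lam * n)) * x_i
            w = soft(v, sigma)  # proximal step recovers the sparse primal point
    return w

# Toy usage with synthetic data (illustrative only)
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.standard_normal((200, 20))
    w_true = np.zeros(20)
    w_true[:3] = [2.0, -1.0, 0.5]                 # sparse ground truth
    y = X @ w_true + 0.01 * rng.standard_normal(200)
    w_hat = prox_sdca(X, y, lam=1e-3, sigma=1.0)
    print("nonzeros in w_hat:", np.count_nonzero(np.round(w_hat, 3)))
```

Setting σ = 0 reduces the sketch to plain SDCA for ridge regression, while larger σ drives coordinates of w to exactly zero, which is how the Lasso-type problems mentioned in the abstract fit the same dual coordinate ascent template.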