
Stochastic Variance-Reduced Newton: Accelerating Finite-Sum Minimization with Large Batches

2022-06-06 · Code Available

Michał Dereziński


Abstract

Stochastic variance reduction has proven effective at accelerating first-order algorithms for solving convex finite-sum optimization tasks such as empirical risk minimization. Incorporating second-order information has proven helpful in further improving the performance of these first-order methods. Yet, comparatively little is known about the benefits of using variance reduction to accelerate popular stochastic second-order methods such as Subsampled Newton. To address this, we propose Stochastic Variance-Reduced Newton (SVRN), a finite-sum minimization algorithm that provably accelerates existing stochastic Newton methods from O(α log(1/ε)) to O(log(1/ε)/log(n)) passes over the data, i.e., by a factor of O(α log(n)), where n is the number of sum components and α is the approximation factor in the Hessian estimate. Surprisingly, this acceleration gets more significant the larger the data size n, which is a unique property of SVRN. Our algorithm retains the key advantages of Newton-type methods, such as easily parallelizable large-batch operations and a simple unit step size. We use SVRN to accelerate Subsampled Newton and Iterative Hessian Sketch algorithms, and show that it compares favorably to popular first-order methods with variance reduction.
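The update the abstract describes combines a subsampled Newton Hessian estimate with an SVRG-style variance-reduced gradient and a unit step size. The following is a minimal sketch of that scheme, not the paper's reference implementation: the function name, the least-squares objective, and all sample-size parameters are my own illustrative choices.

```python
import numpy as np

def svrn_sketch(A, b, x0, outer_iters=10, inner_iters=5,
                hess_sample=200, grad_batch=200, seed=0):
    """Illustrative SVRN-style solver for least squares
    f(x) = (1/2n) * ||Ax - b||^2 (a hypothetical toy setup).

    Each outer stage: build a subsampled Hessian estimate, take one
    full pass to get the exact gradient at an anchor point, then run
    unit-step Newton-type updates driven by variance-reduced
    large-batch gradient estimates.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x_anchor = x0.copy()
    for _ in range(outer_iters):
        # Subsampled Newton: Hessian estimate from a row sample
        # (its quality corresponds to the approximation factor alpha).
        idx = rng.choice(n, size=min(hess_sample, n), replace=False)
        H = A[idx].T @ A[idx] / len(idx) + 1e-8 * np.eye(d)
        # Full gradient at the anchor point (one pass over the data).
        g_anchor = A.T @ (A @ x_anchor - b) / n
        x = x_anchor.copy()
        for _ in range(inner_iters):
            # SVRG-style variance-reduced gradient on a large batch:
            # full anchor gradient plus a sampled correction term.
            S = rng.choice(n, size=min(grad_batch, n), replace=False)
            g_est = g_anchor + A[S].T @ (A[S] @ (x - x_anchor)) / len(S)
            # Newton-type update with a simple unit step size.
            x -= np.linalg.solve(H, g_est)
        x_anchor = x
    return x_anchor
```

The large-batch gradient and the linear solve are both embarrassingly parallel, which reflects the abstract's point that SVRN keeps the easily parallelizable structure of Newton-type methods.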
