SOTAVerified

Global linear convergence of Newton's method without strong-convexity or Lipschitz gradients

2018-06-01

Sai Praneeth Karimireddy, Sebastian U. Stich, Martin Jaggi


Abstract

We show that Newton's method converges globally at a linear rate for objective functions whose Hessians are stable. This class of problems includes many functions which are not strongly convex, such as logistic regression. Our linear convergence result (i) is affine-invariant, (ii) holds even if an approximate Hessian is used, and (iii) holds even if the subproblems are solved only approximately. Thus we theoretically demonstrate the superiority of Newton's method over first-order methods, which would only achieve a sublinear O(1/t^2) rate under similar conditions.
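To make the setting concrete, below is a minimal sketch of a damped Newton iteration on logistic regression, the paper's running example of a problem that is not strongly convex (its Hessian can be arbitrarily close to singular) yet still well-suited to second-order methods. This is an illustrative implementation, not the authors' exact algorithm; the function names, the damping parameter, and the small `reg` term added for numerical stability are our own assumptions.

```python
import numpy as np

def logistic_loss(w, X, y):
    """Average logistic loss with labels y in {-1, +1}."""
    z = y * (X @ w)
    return np.mean(np.log1p(np.exp(-z)))

def newton_logistic(X, y, steps=20, damping=1.0, reg=1e-8):
    """Damped Newton iteration for logistic regression (illustrative sketch).

    The logistic loss is convex but not strongly convex, so plain
    gradient methods only get sublinear rates; Newton steps exploit
    the Hessian's curvature information at each iterate.
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(steps):
        z = y * (X @ w)
        s = 1.0 / (1.0 + np.exp(z))        # sigmoid(-z), per sample
        grad = -(X.T @ (y * s)) / n        # gradient of the average loss
        D = s * (1.0 - s)                  # per-sample Hessian weights
        H = (X.T * D) @ X / n + reg * np.eye(d)  # Hessian + tiny ridge
        w = w - damping * np.linalg.solve(H, grad)  # Newton step
    return w
```

Note that solving the linear system `H x = grad` exactly corresponds to the exact subproblem; the paper's result (ii)–(iii) says the linear rate survives when `H` is only an approximate Hessian and the solve is inexact.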
