A Distributed Cubic-Regularized Newton Method for Smooth Convex Optimization over Networks

2020-07-07

César A. Uribe, Ali Jadbabaie

Abstract

We propose a distributed, cubic-regularized Newton method for large-scale convex optimization over networks. The proposed method requires only local computations and communications and is suitable for federated learning applications over arbitrary network topologies. We show an O(k^{-3}) convergence rate when the cost function is convex with Lipschitz-continuous gradient and Hessian, where k is the number of iterations. We further provide network-dependent bounds on the communication required in each step of the algorithm, and we present numerical experiments that validate our theoretical results.
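The paper itself is not reproduced on this page, but the core building block it extends is the (centralized) cubic-regularized Newton step of Nesterov and Polyak: at each iterate, minimize the second-order Taylor model plus a cubic penalty (M/6)·‖s‖³, where M bounds the Hessian's Lipschitz constant. The sketch below is a minimal, centralized illustration of that step, not the authors' distributed algorithm; the fixed-point inner solver and the test problem are illustrative choices of ours.

```python
import numpy as np

def cubic_newton_step(grad, hess, M, inner_iters=50):
    """Approximately minimize the cubic model
        m(s) = grad^T s + 0.5 s^T hess s + (M/6) ||s||^3
    via the fixed-point iteration s <- -(hess + (M/2)||s|| I)^{-1} grad,
    which is one simple (illustrative) way to solve the subproblem
    when hess is positive semidefinite."""
    n = grad.size
    s = np.zeros(n)
    for _ in range(inner_iters):
        s = -np.linalg.solve(hess + 0.5 * M * np.linalg.norm(s) * np.eye(n), grad)
    return s

def cubic_newton(f_grad, f_hess, x0, M, steps=20):
    """Run the cubic-regularized Newton outer loop from x0."""
    x = x0.copy()
    for _ in range(steps):
        x = x + cubic_newton_step(f_grad(x), f_hess(x), M)
    return x

# Illustrative test problem: least squares f(x) = 0.5 ||A x - b||^2,
# whose gradient is A^T (A x - b) and Hessian is the constant A^T A.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
f_grad = lambda x: A.T @ (A @ x - b)
f_hess = lambda x: A.T @ A
x_star = cubic_newton(f_grad, f_hess, np.zeros(2), M=1.0)
```

The cubic term damps the step far from the optimum (where ‖s‖ is large) and vanishes near it, which is what yields the fast global rates; the distributed method in the paper must additionally solve such subproblems using only local communication over the network.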
