SOTAVerified

Parameter-free Stochastic Optimization of Variationally Coherent Functions

2021-01-30 · Code Available

Francesco Orabona, Dávid Pál


Abstract

We design and analyze an algorithm for first-order stochastic optimization of a large class of functions on R^d. In particular, we consider variationally coherent functions, which can be convex or non-convex. The iterates of our algorithm on variationally coherent functions converge almost surely to the global minimizer x^*. Additionally, the very same algorithm with the same hyperparameters, after T iterations, guarantees on convex functions that the expected suboptimality gap is bounded by O(\|x^* - x_0\| T^{-1/2+\epsilon}) for any \epsilon > 0. It is the first algorithm to achieve both these properties at the same time. Moreover, the rate on convex functions essentially matches the performance of parameter-free algorithms. Our algorithm is an instance of the Follow The Regularized Leader algorithm with the added twist of using rescaled gradients and time-varying linearithmic regularizers.
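The abstract names the algorithmic template: Follow The Regularized Leader (FTRL) run on rescaled (unit-norm) gradients with a time-varying linearithmic regularizer. Below is a minimal illustrative sketch in Python. The specific regularizer psi_t(x) = beta_0 * sqrt(t) * ||x|| * log(1 + ||x||), the scaling schedule, and the gradient-oracle interface are assumptions chosen for illustration; they are not the paper's exact construction.

```python
import numpy as np

def ftrl_normalized(grad_oracle, d, T, beta0=1.0, seed=0):
    """Sketch of FTRL with normalized gradients and a linearithmic
    regularizer psi_t(x) = beta_t * ||x|| * log(1 + ||x||).
    (Assumed form for illustration; the paper's regularizer differs.)"""
    rng = np.random.default_rng(seed)
    theta = np.zeros(d)              # minus the sum of normalized gradients
    x = np.zeros(d)
    iterates = [x.copy()]
    for t in range(1, T + 1):
        g = grad_oracle(x, rng)
        norm_g = np.linalg.norm(g)
        if norm_g > 0:
            theta -= g / norm_g      # rescaled (unit-norm) gradient
        beta_t = beta0 * np.sqrt(t)  # assumed time-varying scale
        nt = np.linalg.norm(theta)
        if nt == 0:
            x = np.zeros(d)
        else:
            # FTRL step: x = argmin_x <-theta, x> + psi_t(x).
            # The minimizer points along theta; its radius r solves
            #   beta_t * (log(1+r) + r/(1+r)) = ||theta||  (bisection).
            phi = lambda r: beta_t * (np.log1p(r) + r / (1.0 + r))
            lo, hi = 0.0, 1.0
            while phi(hi) < nt:
                hi *= 2.0
            for _ in range(60):
                mid = 0.5 * (lo + hi)
                if phi(mid) < nt:
                    lo = mid
                else:
                    hi = mid
            x = (theta / nt) * lo
        iterates.append(x.copy())
    return iterates
```

Because the radius of the iterate grows roughly exponentially in ||theta|| / beta_t, the method can move quickly toward a distant minimizer without knowing ||x^* - x_0|| in advance, which is the hallmark of parameter-free methods.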
