Stochastic Conditional Gradient++

2019-02-19

Hamed Hassani, Amin Karbasi, Aryan Mokhtari, Zebang Shen

Abstract

In this paper, we consider general non-oblivious stochastic optimization, where the underlying stochasticity may change during the optimization procedure and depends on the point at which the function is evaluated. We develop Stochastic Frank-Wolfe++ (SFW++), an efficient variant of the conditional gradient method for minimizing a smooth non-convex function subject to a convex body constraint. We show that SFW++ converges to an ε-first order stationary point by using O(1/ε^3) stochastic gradients. When further structure is present, SFW++'s theoretical guarantees, in terms of the convergence rate and quality of its solution, improve. In particular, for minimizing a convex function, SFW++ achieves an ε-approximate optimum while using O(1/ε^2) stochastic gradients, a rate known to be optimal in terms of stochastic gradient evaluations. Similarly, for maximizing a monotone continuous DR-submodular function, a slightly different form of SFW++, called Stochastic Continuous Greedy++ (SCG++), achieves a tight [(1-1/e)OPT - ε] solution while using O(1/ε^2) stochastic gradients. Through an information theoretic argument, we also prove that SCG++'s convergence rate is optimal. Finally, for maximizing a non-monotone continuous DR-submodular function, we can achieve a [(1/e)OPT - ε] solution by using O(1/ε^2) stochastic gradients. We should highlight that our results and our novel variance reduction technique trivially extend to the standard and easier oblivious stochastic optimization settings for (non-)convex and continuous submodular settings.
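To fix ideas, here is a minimal sketch of a generic stochastic Frank-Wolfe loop, not the paper's SFW++: the function names, the 2/(t+2) step size, and the plain minibatch gradient estimator are illustrative assumptions, and the paper's variance-reduction technique is not reproduced here.

```python
import numpy as np

def stochastic_frank_wolfe(grad_estimate, lmo, x0, num_iters=100):
    """Generic stochastic Frank-Wolfe loop (illustrative sketch only).

    grad_estimate(x, t): stochastic estimate of the gradient at x.
    lmo(g): linear minimization oracle over the convex constraint set C,
            returning argmin_{v in C} <g, v>.
    """
    x = np.asarray(x0, dtype=float)
    for t in range(1, num_iters + 1):
        g = grad_estimate(x, t)          # stochastic gradient estimate
        v = lmo(g)                       # projection-free step direction
        gamma = 2.0 / (t + 2)            # standard diminishing step size
        x = (1 - gamma) * x + gamma * v  # convex combination stays feasible
    return x

# Hypothetical example: minibatch least squares over the probability simplex.
rng = np.random.default_rng(0)
A = rng.normal(size=(50, 10))
b = rng.normal(size=50)

def grad_estimate(x, t, batch=8):
    idx = rng.integers(0, A.shape[0], size=batch)  # sample a minibatch of rows
    Ab, bb = A[idx], b[idx]
    return 2 * Ab.T @ (Ab @ x - bb) / batch        # minibatch gradient of ||Ax - b||^2

def lmo_simplex(g):
    v = np.zeros_like(g)
    v[np.argmin(g)] = 1.0                          # simplex vertex minimizing <g, v>
    return v

x_hat = stochastic_frank_wolfe(grad_estimate, lmo_simplex, np.ones(10) / 10)
```

Per the abstract, the paper's departure from this vanilla loop is the gradient estimator: SFW++ replaces the plain minibatch estimate with a novel variance-reduced estimator suited to the non-oblivious setting, which is what yields the O(1/ε^3) non-convex and O(1/ε^2) convex complexities.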
