Improved Complexities for Stochastic Conditional Gradient Methods under Interpolation-like Conditions
2020-06-15
Tesi Xiao, Krishnakumar Balasubramanian, Saeed Ghadimi
Abstract
We analyze stochastic conditional gradient methods for constrained optimization problems arising in over-parametrized machine learning. We show that one could leverage the interpolation-like conditions satisfied by such models to obtain improved oracle complexities. Specifically, when the objective function is convex, we show that the conditional gradient method requires O(ε^{-2}) calls to the stochastic gradient oracle to find an ε-optimal solution. Furthermore, by including a gradient sliding step, we show that the number of calls reduces to O(ε^{-1.5}).
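To make the setting concrete, below is a minimal sketch of a mini-batch stochastic conditional gradient (Frank-Wolfe) loop of the kind the abstract analyzes. It is an illustration, not the authors' exact algorithm: the gradient sliding variant is not shown, and the names `grad_oracle`, `lmo`, the step-size schedule, and the l1-ball example problem are all assumptions made for this sketch.

```python
import numpy as np

def stochastic_frank_wolfe(grad_oracle, lmo, x0, num_iters, batch_size):
    """Minimal stochastic conditional gradient (Frank-Wolfe) sketch.

    grad_oracle(x, batch_size) -> averaged stochastic gradient at x
    lmo(g) -> argmin over v in the feasible set C of <g, v>
              (the linear minimization oracle)
    """
    x = x0.copy()
    for t in range(num_iters):
        g = grad_oracle(x, batch_size)   # mini-batch stochastic gradient
        v = lmo(g)                       # Frank-Wolfe vertex of C
        gamma = 2.0 / (t + 2)            # standard open-loop step size
        x = (1 - gamma) * x + gamma * v  # convex combination stays feasible
    return x

# Hypothetical usage: least squares over an l1 ball of radius r,
# whose LMO returns a signed, scaled coordinate vector.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((200, 50)), rng.standard_normal(200)
r = 5.0

def grad_oracle(x, bs):
    idx = rng.integers(0, A.shape[0], size=bs)
    Ai, bi = A[idx], b[idx]
    return Ai.T @ (Ai @ x - bi) / bs

def lmo(g):
    v = np.zeros_like(g)
    i = np.argmax(np.abs(g))
    v[i] = -r * np.sign(g[i])
    return v

x_hat = stochastic_frank_wolfe(grad_oracle, lmo, np.zeros(50), 500, 32)
```

The projection-free character of the method is what the oracle-complexity bounds refer to: each iteration costs one mini-batch gradient call plus one linear minimization, and the interpolation-like conditions in the paper govern how small the batch sizes can be while still reaching an ε-optimal point.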