SVGD: A Virtual Gradients Descent Method for Stochastic Optimization
2019-07-09
Zheng Li, Shi Shu
Abstract
Inspired by dynamic programming, we propose the Stochastic Virtual Gradient Descent (SVGD) algorithm, in which the virtual gradient is defined via the computational graph and automatic differentiation. The method is computationally efficient and requires little memory. We also analyze the algorithm's theoretical convergence properties and its implementation. Experimental results on multiple datasets and network models show that SVGD has advantages over other stochastic optimization methods.
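Since the abstract does not reproduce the paper's update rule, the following is only a minimal PyTorch sketch of the general shape of such a method: gradients are obtained by automatic differentiation over the computational graph, and a virtual-gradient transformation is applied before the descent step. The function name `svgd_style_step` and the placeholder `virtual_g` are assumptions for illustration, not the paper's actual definition.

```python
import torch

def svgd_style_step(params, loss, lr=0.01):
    """One SGD-like step in which each parameter's gradient, obtained by
    automatic differentiation over the computational graph, is replaced by
    a 'virtual gradient' before the update.

    NOTE: `virtual_g` below is a hypothetical placeholder; the paper
    derives its virtual gradient via dynamic programming on the graph."""
    # Backpropagate through the computational graph to get the true gradients.
    grads = torch.autograd.grad(loss, params)
    with torch.no_grad():
        for p, g in zip(params, grads):
            virtual_g = g  # placeholder: substitute the paper's virtual gradient here
            p -= lr * virtual_g

# Usage on a toy quadratic objective:
w = torch.randn(3, requires_grad=True)
for _ in range(100):
    loss = (w ** 2).sum()
    svgd_style_step([w], loss, lr=0.1)
```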