Variance Suppression: Balanced Training Process in Deep Learning

2018-11-20

Tao Yi, Xingxuan Wang

Abstract

Stochastic gradient descent updates parameters with a summed gradient computed over a random data batch. This summation leads to an unbalanced training process when the data are unbalanced. To address this issue, this paper takes both the error variance and the error mean into consideration. Our algorithm also includes an approach for adaptively adjusting the trade-off between the two terms. Because the algorithm suppresses the error variance, we name it Variance Suppression Gradient Descent (VSSGD). Experimental results demonstrate that VSSGD can accelerate training, effectively prevent overfitting, and improve a network's capacity to learn from small samples.
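
The abstract describes an objective that combines the per-sample error mean with an error-variance penalty. The sketch below is a minimal, hypothetical illustration of that idea in PyTorch, not the authors' algorithm: the fixed trade-off weight `lam`, the helper name `vs_loss`, and the choice of a per-sample cross-entropy loss are all assumptions, and the adaptive adjustment of the trade-off described in the paper is not reproduced here.

```python
import torch

def vs_loss(per_sample_errors: torch.Tensor, lam: float) -> torch.Tensor:
    # Variance-suppressed objective: batch error mean plus a weighted
    # batch error variance. `lam` is an assumed fixed trade-off weight;
    # the paper adjusts this trade-off adaptively.
    mean = per_sample_errors.mean()
    var = per_sample_errors.var(unbiased=False)
    return mean + lam * var

# Usage: any loss with reduction="none" yields per-sample errors.
model = torch.nn.Linear(10, 2)
criterion = torch.nn.CrossEntropyLoss(reduction="none")
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

x = torch.randn(32, 10)                 # a toy batch
y = torch.randint(0, 2, (32,))

per_sample = criterion(model(x), y)     # shape (32,): one error per sample
loss = vs_loss(per_sample, lam=0.5)     # mean + 0.5 * variance
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

Penalizing the batch variance of per-sample errors pushes the optimizer to shrink the spread of errors across samples rather than only their average, which is one plausible reading of how such an objective could balance training on unbalanced data.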
