On the Trend-corrected Variant of Adaptive Stochastic Optimization Methods

2020-01-17

Bingxin Zhou, Xuebin Zheng, Junbin Gao


Abstract

Adam-type optimizers, a class of adaptive moment estimation methods based on exponential moving averages, have been used successfully in many deep learning applications. These methods are appealing because they handle large-scale sparse datasets with high computational efficiency. In this paper, we present a new framework for Adam-type methods that incorporates trend information when updating the parameters with the adaptive step size and gradients. The additional terms promote efficient movement on complex cost surfaces, so the loss converges more rapidly. We show empirically the importance of the trend component: our framework consistently outperforms the conventional Adam and AMSGrad methods on classical models across several real-world datasets.
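To make the idea concrete, here is a minimal sketch of an Adam-style update augmented with a trend term. This is an illustration only, not the paper's exact algorithm: we assume the trend is tracked with Holt's linear (double exponential) smoothing on the first moment, and the hyperparameter `gamma` and the helper `adam_with_trend` are hypothetical names introduced for this example.

```python
import numpy as np

def adam_with_trend(grad_fn, x0, lr=0.001, beta1=0.9, beta2=0.999,
                    gamma=0.9, eps=1e-8, steps=100):
    """Illustrative trend-corrected Adam step (not the paper's exact update).

    Alongside Adam's exponential moving averages m (first moment) and
    v (second moment), we keep a trend estimate b via Holt's linear
    smoothing, so the smoothed gradient can extrapolate its recent drift.
    """
    x = np.asarray(x0, dtype=float)
    m = np.zeros_like(x)   # level: smoothed first moment
    b = np.zeros_like(x)   # trend of the first moment (assumed form)
    v = np.zeros_like(x)   # second moment
    for t in range(1, steps + 1):
        g = grad_fn(x)
        m_prev = m
        m = beta1 * (m + b) + (1 - beta1) * g        # level update with trend
        b = gamma * b + (1 - gamma) * (m - m_prev)   # trend update
        v = beta2 * v + (1 - beta2) * g * g
        m_hat = m / (1 - beta1 ** t)                 # Adam bias correction
        v_hat = v / (1 - beta2 ** t)
        x = x - lr * m_hat / (np.sqrt(v_hat) + eps)
    return x
```

On a simple convex objective such as f(x) = x^2, this sketch behaves like Adam with an extra momentum-like correction; the trend term speeds up progress while the gradient direction is stable and shrinks when it reverses.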
