
Conjugate-gradient-based Adam for stochastic optimization and its application to deep learning

2020-02-29

Yu Kobayashi, Hideaki Iiduka

Abstract

This paper proposes a conjugate-gradient-based Adam algorithm that blends Adam with nonlinear conjugate gradient methods and presents a convergence analysis for it. Numerical experiments on text classification and image classification show that the proposed algorithm can train deep neural network models in fewer epochs than existing adaptive stochastic optimization algorithms.
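The paper's exact update rule is not reproduced on this page, but the idea of blending a nonlinear conjugate-gradient direction into Adam can be sketched as follows. This is a hedged illustration, not the authors' algorithm: it uses the Fletcher–Reeves coefficient and a `1/t` damping factor on that coefficient as assumptions (the paper uses its own scaling of the conjugate-gradient parameter to guarantee convergence), and the function `cg_adam` and all hyperparameter defaults are hypothetical.

```python
import numpy as np

def cg_adam(grad, x0, steps=500, alpha=0.01, beta1=0.9, beta2=0.999, eps=1e-8):
    """Sketch of a conjugate-gradient-flavored Adam.

    Adam's first moment tracks a nonlinear conjugate-gradient search
    direction instead of the raw gradient; the second moment tracks the
    squared gradient as in standard Adam. Assumptions (not from the
    paper): Fletcher-Reeves coefficient, damped by 1/t for stability.
    """
    x = np.asarray(x0, dtype=float)
    m = np.zeros_like(x)        # first-moment estimate (of the CG direction)
    v = np.zeros_like(x)        # second-moment estimate (of the gradient)
    d = np.zeros_like(x)        # conjugate-gradient search direction
    g_prev_sq = None
    for t in range(1, steps + 1):
        g = grad(x)
        g_sq = float(g @ g)
        # Damped Fletcher-Reeves coefficient (assumption; the paper scales
        # its CG parameter differently to ensure convergence).
        beta_cg = 0.0 if g_prev_sq is None else g_sq / (g_prev_sq + eps) / t
        d = -g + beta_cg * d    # descent direction blending old direction in
        m = beta1 * m + (1 - beta1) * d
        v = beta2 * v + (1 - beta2) * (g * g)
        m_hat = m / (1 - beta1 ** t)        # bias-corrected first moment
        v_hat = v / (1 - beta2 ** t)        # bias-corrected second moment
        # d already points downhill, so we step along +m_hat.
        x = x + alpha * m_hat / (np.sqrt(v_hat) + eps)
        g_prev_sq = g_sq
    return x

# Usage: minimize f(x) = ||x - 1||^2, whose gradient is 2 * (x - 1).
x_star = cg_adam(lambda x: 2.0 * (x - np.ones(3)), np.zeros(3))
```

The damping of the conjugate-gradient coefficient is the key design point: without it, the recursive direction `d` can accumulate and destabilize the adaptive step, which is why the paper attaches a convergence analysis to its particular choice of scaling.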
