AdaFamily: A family of Adam-like adaptive gradient methods
2022-03-03
Hannes Fassold
Abstract
We propose AdaFamily, a novel method for training deep neural networks. It is a family of adaptive gradient methods and can be interpreted as a blend of the optimization algorithms Adam, AdaBelief and AdaMomentum. We perform experiments on standard datasets for image classification, demonstrating that our proposed method outperforms these algorithms.
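The abstract gives no update equations, so the following is only a hypothetical sketch of what a "blend" of Adam, AdaBelief and AdaMomentum could look like. Adam estimates the second moment of the gradient g_t, AdaBelief of the deviation g_t - m_t from the first moment, and AdaMomentum of the first moment m_t itself; one plausible family interpolates the argument of the second-moment estimate as a convex combination of these three terms. The function name `adafamily_step`, the `weights` parameter and this particular parametrization are assumptions for illustration, not the paper's actual method.

```python
import numpy as np

def adafamily_step(theta, grad, state, lr=1e-3, betas=(0.9, 0.999),
                   weights=(1.0, 0.0, 0.0), eps=1e-8):
    """One illustrative AdaFamily-style update (assumed parametrization,
    not taken from the paper).

    `weights` = (w_adam, w_belief, w_momentum) blends the argument of the
    second-moment estimate:
        a_t = w_adam * g_t + w_belief * (g_t - m_t) + w_momentum * m_t
    so (1, 0, 0) recovers Adam, (0, 1, 0) AdaBelief and (0, 0, 1) AdaMomentum.
    """
    beta1, beta2 = betas
    state["t"] += 1
    t = state["t"]
    m, v = state["m"], state["v"]

    # First-moment (momentum) estimate, identical to Adam.
    m[:] = beta1 * m + (1 - beta1) * grad

    # Blended second-moment argument: convex combination of the three
    # candidate terms used by Adam, AdaBelief and AdaMomentum.
    w_a, w_b, w_m = weights
    a = w_a * grad + w_b * (grad - m) + w_m * m
    v[:] = beta2 * v + (1 - beta2) * a ** 2

    # Bias correction and parameter update, as in Adam.
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    return theta - lr * m_hat / (np.sqrt(v_hat) + eps)

# Toy usage: minimize f(theta) = ||theta||^2 with an even Adam/AdaBelief mix.
theta = np.array([3.0, -2.0])
state = {"t": 0, "m": np.zeros(2), "v": np.zeros(2)}
for _ in range(500):
    grad = 2 * theta  # gradient of ||theta||^2
    theta = adafamily_step(theta, grad, state, lr=0.05,
                           weights=(0.5, 0.5, 0.0))
```

Writing the family as a convex combination makes the three named optimizers corner cases of one parameter space, which matches the abstract's framing of a single method family containing all of them; the paper itself may use a different (e.g. single-parameter) interpolation.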