On a Conjecture Regarding the Adam Optimizer
2021-11-16
Mohamed Akrout, Douglas Tweed
Abstract
Why does the Adam optimizer work so well in deep-learning applications? Adam's originators, Kingma and Ba, presented a mathematical argument meant to help explain its success, but Bock and colleagues have since reported that a key piece is missing from that argument: an unproven lemma, which we will call Bock's conjecture. Here we show that this conjecture is false, but we prove a modified version of it, a generalization of a result of Reddi and colleagues, which can take its place in analyses of Adam.
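For context, the optimizer under discussion is the one introduced by Kingma and Ba; the sketch below restates its update rule in standard notation as background (it is not part of the abstract itself), with $g_t$ the stochastic gradient at step $t$ and $\alpha$, $\beta_1$, $\beta_2$, $\epsilon$ the usual hyperparameters.

% Adam update rule (Kingma & Ba), restated as background
\begin{align*}
m_t &= \beta_1 m_{t-1} + (1-\beta_1)\, g_t,
& v_t &= \beta_2 v_{t-1} + (1-\beta_2)\, g_t^{2}, \\
\hat{m}_t &= \frac{m_t}{1-\beta_1^{t}},
& \hat{v}_t &= \frac{v_t}{1-\beta_2^{t}}, \\
\theta_t &= \theta_{t-1} - \alpha\, \frac{\hat{m}_t}{\sqrt{\hat{v}_t} + \epsilon}. &&
\end{align*}

The unproven lemma that Bock and colleagues identified, and that this paper addresses, arises in the convergence analysis built on these updates.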