SING: A Plug-and-Play DNN Learning Technique
Adrien Courtois, Damien Scieur, Jean-Michel Morel, Pablo Arias, Thomas Eboli
Code
- github.com/adriencourtois/sing (official implementation, PyTorch)
- github.com/pwc-1/Paper-9/tree/main/4/SING (MindSpore)
Abstract
We propose SING (StabIlized and Normalized Gradient), a plug-and-play technique that improves the stability and generalization of the Adam(W) optimizer. SING is straightforward to implement and has minimal computational overhead: it requires only a layer-wise standardization of the gradients fed to Adam(W) and introduces no additional hyper-parameters. We support the effectiveness and practicality of the proposed approach by showing improved results across a wide range of architectures and problems (such as image classification, depth estimation, and natural language processing), as well as in combination with other optimizers. We provide a theoretical analysis of the convergence of the method, and we show that, by virtue of the standardization, SING can escape local minima narrower than a threshold that is inversely proportional to the network's depth.
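The abstract's description suggests a very small change to a standard training loop: standardize each layer's gradient in place before handing it to Adam(W). The sketch below is an illustrative reading of that description, not the official implementation; the helper name `sing_standardize_`, the `eps` value, and the per-parameter-tensor grouping are assumptions (the official repository linked above is the authoritative reference).

```python
import torch
import torch.nn as nn

def sing_standardize_(model: nn.Module, eps: float = 1e-8) -> None:
    # Assumed reading of "layer-wise standardization": shift each
    # parameter tensor's gradient to zero mean and unit standard
    # deviation in place, just before the Adam(W) update.
    for p in model.parameters():
        if p.grad is None or p.grad.numel() < 2:
            continue  # skip scalar parameters, whose std is undefined
        g = p.grad
        g.sub_(g.mean()).div_(g.std() + eps)

# Toy usage: plug the standardization into an ordinary AdamW step.
model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 1))
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
x, y = torch.randn(8, 16), torch.randn(8, 1)

loss = nn.functional.mse_loss(model(x), y)
optimizer.zero_grad()
loss.backward()
sing_standardize_(model)  # the only extra line relative to plain AdamW
optimizer.step()
```

Apart from the numerical `eps`, this sketch adds no tunable quantities, which is consistent with the abstract's claim that SING introduces no additional hyper-parameters.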