
An Adaptive Method Stabilizing Activations for Enhanced Generalization

2025-06-10

Hyunseok Seung, Jaewoo Lee, Hyunsuk Ko


Abstract

We introduce AdaAct, a novel optimization algorithm that adjusts learning rates according to activation variance. Our method enhances the stability of neuron outputs by incorporating neuron-wise adaptivity during the training process, which subsequently leads to better generalization -- a complementary approach to conventional activation regularization methods. Experimental results demonstrate AdaAct's competitive performance across standard image classification benchmarks. We evaluate AdaAct on CIFAR and ImageNet, comparing it with other state-of-the-art methods. Importantly, AdaAct effectively bridges the gap between the convergence speed of Adam and the strong generalization capabilities of SGD, all while maintaining competitive execution times. Code is available at https://github.com/hseung88/adaact.
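The core idea in the abstract, scaling each neuron's learning rate inversely with the variance of its activations, can be illustrated with a minimal sketch. This is not the authors' implementation (see the linked repository for that); the function name, state layout, and hyperparameters here are illustrative assumptions.

```python
import numpy as np

def adaact_step(weights, grad, activations, state, lr=0.01, beta=0.99, eps=1e-8):
    """Hypothetical sketch of a neuron-wise adaptive update.

    weights, grad: (in_dim, out_dim) arrays for one layer.
    activations:   (batch, out_dim) outputs of that layer's neurons.
    state:         dict holding a running variance estimate per neuron.
    """
    # Per-neuron activation variance over the current batch.
    var = np.var(activations, axis=0)
    # Exponential moving average of the variance (assumed smoothing scheme).
    state["v"] = beta * state.get("v", np.zeros_like(var)) + (1 - beta) * var
    # Neurons with volatile (high-variance) outputs get smaller steps.
    scale = 1.0 / (np.sqrt(state["v"]) + eps)
    return weights - lr * scale[np.newaxis, :] * grad
```

Under this reading, the stabilizing effect comes from damping updates to neurons whose outputs fluctuate most, complementing activation-regularization methods that penalize such fluctuations directly in the loss.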
