
RepAn: Enhanced Annealing through Re-parameterization

2024-01-01 · CVPR 2024

Xiang Fei, Xiawu Zheng, Yan Wang, Fei Chao, Chenglin Wu, Liujuan Cao


Abstract

The simulated annealing algorithm aims to improve model convergence through multiple restarts of training. However, existing annealing algorithms overlook the correlation between different cycles, neglecting the potential for incremental learning. We contend that a fixed network structure prevents the model from recognizing distinct features at different training stages. To this end, we propose RepAn, redesigning the irreversible re-parameterization (Rep) method and integrating it with annealing to enhance training. Specifically, the network goes through Rep expansion, restoration, and backpropagation operations during training, iterating through these processes in each annealing round. The method generalizes well and is easy to apply, and we provide theoretical explanations for its effectiveness. Experiments demonstrate that our method improves baseline performance by 6.38% on the CIFAR-100 dataset and 2.80% on ImageNet, achieving state-of-the-art performance in the Rep field. The code is available at https://github.com/xfey/RepAn.
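The per-round cycle the abstract describes (expand the structure, train by backpropagation, restore the original structure) can be sketched on a toy problem. This is an illustrative reconstruction, not the authors' implementation: it uses plain linear layers instead of convolutions, gradient descent on a least-squares objective instead of a full training loop, and all function names are hypothetical. The key re-parameterization fact it relies on is real, though: parallel linear branches sum exactly into one equivalent weight, so expansion and restoration are function-preserving.

```python
import numpy as np

rng = np.random.default_rng(0)

def expand(W, n_branches=2):
    # "Rep expansion": split one weight into parallel branches whose sum
    # equals W, so the split does not change the network's function.
    parts = [rng.normal(scale=0.01, size=W.shape) for _ in range(n_branches - 1)]
    parts.append(W - sum(parts))
    return parts

def restore(branches):
    # "Restoration": merge parallel linear branches back into a single
    # equivalent weight (the linearity makes this exact).
    return sum(branches)

# Toy regression task standing in for training data.
X = rng.normal(size=(64, 8))
W_true = rng.normal(size=(8, 4))
y = X @ W_true

W = rng.normal(scale=0.1, size=(8, 4))
init_loss = np.mean((X @ W - y) ** 2)
cycles, steps, lr_max = 3, 50, 0.05

for c in range(cycles):                    # one pass per annealing round
    branches = expand(W)                   # 1) Rep expansion
    for t in range(steps):                 # 2) backpropagation
        lr = 0.5 * lr_max * (1 + np.cos(np.pi * t / steps))  # cosine anneal
        grad = X.T @ (X @ restore(branches) - y) / len(X)
        # Each branch receives the shared gradient, so the merged weight
        # effectively takes a larger step than a single-branch layer would.
        branches = [B - lr * grad for B in branches]
    W = restore(branches)                  # 3) restoration before the next round
```

The sketch shows why the expanded phase changes training dynamics even though expansion and restoration are lossless at the moment they are applied.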
