NAM: Normalization-based Attention Module
2021-11-24 · NeurIPS Workshop ImageNet PPF 2021
Yichao Liu, Zongru Shao, Yueyang Teng, Nico Hoffmann
- Code: github.com/Christian-lyc/NAM (official implementation, PyTorch, ★ 157)
Abstract
Recognizing less salient features is key to model compression. However, this has not been investigated in the recent, influential attention mechanisms. In this work, we propose a novel normalization-based attention module (NAM), which suppresses less salient weights. It applies a weight sparsity penalty to the attention modules, making them more computationally efficient while retaining similar performance. A comparison with three other attention mechanisms on both ResNet and MobileNet indicates that our method yields higher accuracy. Code for this paper is publicly available at https://github.com/Christian-lyc/NAM.
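The core idea can be sketched as a channel-attention gate whose weights come from normalization scale factors: channels whose batch-norm scale factor (gamma) is small carry less variance, and are therefore suppressed. The sketch below is a minimal NumPy illustration under that assumption; the function name, the sigmoid gating, and the gamma-normalization step are illustrative choices, not the authors' exact implementation (see the linked repository for that).

```python
import numpy as np


def nam_channel_attention(x, gamma):
    """Illustrative normalization-based channel attention.

    x     : feature map of shape (C, H, W)
    gamma : per-channel batch-norm scale factors, shape (C,)

    Each channel is weighted by its gamma normalized over the sum of
    all gammas, so channels with small scale factors (less salient
    features) are suppressed; a sigmoid gates the result back onto x.
    """
    w = gamma / gamma.sum()                # normalized per-channel weights
    gated = x * w[:, None, None]           # scale each channel by its weight
    att = 1.0 / (1.0 + np.exp(-gated))     # sigmoid attention map in (0, 1)
    return x * att                         # re-weighted feature map


# Example: a channel with a larger gamma is suppressed less.
x = np.ones((3, 4, 4))
gamma = np.array([0.1, 0.3, 0.6])
out = nam_channel_attention(x, gamma)
```

A sparsity penalty on the attention weights (e.g. an L1 term on gamma added to the training loss) would then push uninformative channels toward zero, which is the mechanism the abstract refers to.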