Introduce the Result Into Self-Attention

2021-09-21

Chengcheng Ye

Abstract

Traditional self-attention mechanisms in convolutional networks, such as SENet and CBAM, tend to use only the output of the previous layer as input to the attention network. In this paper, we propose a new attention modification that obtains the output of the classification network in advance and uses it as part of the input to the attention network. We use the auxiliary classifier proposed in GoogLeNet to obtain this early result and pass it into the attention networks. We added this mechanism to SE-ResNet for our experiments and achieved a classification accuracy improvement of at most 1.94% on CIFAR-100.
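The abstract does not give implementation details, but the core idea, feeding an auxiliary classifier's early prediction into an SE-style excitation alongside the pooled features, can be sketched roughly as follows. All names, shapes, and the concatenation scheme below are assumptions for illustration, not the paper's actual design:

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D logit vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def result_aware_se(feature_map, aux_logits, w1, w2):
    """Hypothetical result-aware SE block.

    Scales the channels of `feature_map` (C, H, W) using both the
    globally pooled features (as in SENet) and the early class
    prediction produced by a GoogLeNet-style auxiliary classifier.
    """
    pooled = feature_map.mean(axis=(1, 2))    # squeeze: (C,)
    aux_probs = softmax(aux_logits)           # early result: (K,)
    z = np.concatenate([pooled, aux_probs])   # joint attention input: (C + K,)
    h = np.maximum(0.0, w1 @ z)               # excitation FC1 + ReLU: (C // r,)
    s = sigmoid(w2 @ h)                       # per-channel gates in (0, 1): (C,)
    return feature_map * s[:, None, None]     # rescale each channel

# Toy dimensions: C channels, K classes, reduction ratio r (all assumed).
C, K, H, W, r = 8, 10, 4, 4, 2
rng = np.random.default_rng(0)
fmap = rng.standard_normal((C, H, W))
logits = rng.standard_normal(K)
w1 = rng.standard_normal((C // r, C + K))
w2 = rng.standard_normal((C, C // r))
out = result_aware_se(fmap, logits, w1, w2)
```

Because the gates come from a sigmoid, each output channel is an attenuated copy of the input channel; the only change relative to a plain SE block is that the excitation MLP also sees the auxiliary classifier's class distribution.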
