
Attend and Rectify: a Gated Attention Mechanism for Fine-Grained Recovery

2018-07-19 · ECCV 2018 · Code Available

Pau Rodríguez, Josep M. Gonfaus, Guillem Cucurull, F. Xavier Roca, Jordi Gonzàlez


Abstract

We propose a novel attention mechanism to enhance Convolutional Neural Networks for fine-grained recognition. It learns to attend to lower-level feature activations without requiring part annotations and uses these activations to update and rectify the output likelihood distribution. In contrast to other approaches, the proposed mechanism is modular, architecture-independent, and efficient in terms of both parameters and computation. Experiments show that networks augmented with our approach systematically improve their classification accuracy and become more robust to clutter. As a result, Wide Residual Networks augmented with our proposal surpass state-of-the-art classification accuracies on CIFAR-10, the Adience gender recognition task, Stanford Dogs, and UEC Food-100.
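To make the described mechanism concrete, the sketch below shows one way such a gated attention module could look in PyTorch: several attention heads compute spatial softmaxes over a lower-level feature map, each head produces an auxiliary class-probability estimate, and learned gates weight the heads' contributions so they can rectify the backbone's output distribution. The module name, the gating form, and all hyperparameters here are illustrative assumptions, not the authors' reference implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GatedAttentionModule(nn.Module):
    """Hypothetical gated attention head bank over a lower-level feature map."""

    def __init__(self, in_channels, num_classes, num_heads=4):
        super().__init__()
        self.num_heads = num_heads
        self.num_classes = num_classes
        # 1x1 convs: per-head spatial attention maps and per-location class scores
        self.att_conv = nn.Conv2d(in_channels, num_heads, kernel_size=1)
        self.out_conv = nn.Conv2d(in_channels, num_heads * num_classes, kernel_size=1)
        # One scalar gate per head, predicted from globally pooled features
        self.gate_fc = nn.Linear(in_channels, num_heads)

    def forward(self, feat):
        b, c, h, w = feat.shape
        # Spatial softmax: where each head attends within the feature map
        att = F.softmax(self.att_conv(feat).view(b, self.num_heads, h * w), dim=-1)
        # Per-location class hypotheses, one set per head
        scores = self.out_conv(feat).view(b, self.num_heads, self.num_classes, h * w)
        # Attention-weighted sum over locations -> per-head class logits
        head_logits = torch.einsum('bkn,bkcn->bkc', att, scores)
        head_probs = F.softmax(head_logits, dim=-1)
        # Gates decide how much each head contributes to the rectified output
        gates = torch.softmax(self.gate_fc(feat.mean(dim=(2, 3))), dim=-1)
        return (gates.unsqueeze(-1) * head_probs).sum(dim=1)  # (b, num_classes)
```

In use, the module's output would be combined with the backbone's own softmax prediction, for example as a gated or averaged mixture of the two distributions; because the module only adds a few 1x1 convolutions and a small linear layer, it stays cheap in parameters and computation, consistent with the modularity the abstract claims.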

Tasks

Benchmark Results

Dataset     Model  Metric              Claimed  Verified  Status
CIFAR-100   WARN   Percentage correct  82.18    -         Unverified

Reproductions