SOTAVerified

Attention-over-Attention Neural Networks for Reading Comprehension

Published 2016-07-15 · ACL 2017 · Code Available

Yiming Cui, Zhipeng Chen, Si Wei, Shijin Wang, Ting Liu, Guoping Hu


Abstract

Cloze-style queries are representative problems in reading comprehension. Over the past few months, much progress has been made in applying neural network approaches to Cloze-style questions. In this paper, we present a novel model, the attention-over-attention reader, for the Cloze-style reading comprehension task. Our model places another attention mechanism over the document-level attention, inducing "attended attention" for the final predictions. Unlike previous works, our neural network model requires fewer pre-defined hyper-parameters and uses an elegant architecture for modeling. Experimental results show that the proposed attention-over-attention model outperforms various state-of-the-art systems by a large margin on public datasets such as the CNN and Children's Book Test datasets.
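The core idea in the abstract can be sketched briefly: compute a pairwise matching matrix between document and query word representations, take a column-wise softmax (document-level attention per query word) and a row-wise softmax averaged over document words (query-level attention), then combine the two into an "attended attention" over document words. The snippet below is a minimal NumPy sketch under these assumptions; the function name and random inputs are illustrative, not taken from the authors' released code.

```python
import numpy as np

def softmax(x, axis):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_over_attention(h_doc, h_query):
    """Attended attention over document words (hypothetical sketch).

    h_doc:   (|D|, d) contextual embeddings of document words
    h_query: (|Q|, d) contextual embeddings of query words
    """
    M = h_doc @ h_query.T                    # pairwise matching scores, shape (|D|, |Q|)
    alpha = softmax(M, axis=0)               # column-wise: one document-level attention per query word
    beta = softmax(M, axis=1).mean(axis=0)   # row-wise softmax averaged over doc words -> query-level attention, (|Q|,)
    return alpha @ beta                      # attended attention over document words, shape (|D|,)
```

Because each column of `alpha` and the vector `beta` are probability distributions, the result is itself a distribution over document positions; candidate answers are then scored by summing the attention mass over their occurrences in the document.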

Benchmark Results

| Dataset              | Model      | Metric      | Claimed | Verified | Status     |
|----------------------|------------|-------------|---------|----------|------------|
| Children's Book Test | AoA Reader | Accuracy-CN | 69.4    | —        | Unverified |
| CNN / Daily Mail     | AoA Reader | CNN         | 74.4    | —        | Unverified |
