
Ruminating Reader: Reasoning with Gated Multi-Hop Attention

2017-04-24 · WS 2018 · Unverified

Yichen Gong, Samuel R. Bowman


Abstract

To answer questions in the machine comprehension (MC) task, models need to establish interactions between the question and the context. To address the problem that a single-pass model cannot reflect on and correct its answer, we present the Ruminating Reader. The Ruminating Reader adds a second pass of attention and a novel information fusion component to the Bi-Directional Attention Flow model (BiDAF). We propose novel layer structures that construct a query-aware context vector representation and fuse the encoding representation with the intermediate representation on top of the BiDAF model. We show that a multi-hop attention mechanism can be applied to a bi-directional attention structure. In experiments on SQuAD, we find that the Ruminating Reader outperforms the BiDAF baseline by a substantial margin, and matches or surpasses the performance of all other published systems.
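The abstract's second-pass idea, re-reading the first-pass encodings with a summary of the query-aware context and blending the two via learned gates, can be sketched as follows. This is a minimal illustration of the gating mechanism only, not the authors' implementation: the module and parameter names (GatedFusion, proj, gate) are hypothetical, and BiDAF's attention and summarization layers are assumed to exist upstream.

```python
import torch
import torch.nn as nn

class GatedFusion(nn.Module):
    """Sketch of a ruminate-style gate: blend each token's first-pass
    encoding x_i with a whole-passage summary vector s via a learned
    sigmoid gate. Names and dimensions are illustrative assumptions."""

    def __init__(self, dim: int):
        super().__init__()
        self.proj = nn.Linear(2 * dim, dim)  # candidate update z_i
        self.gate = nn.Linear(2 * dim, dim)  # mixing gate f_i in (0, 1)

    def forward(self, x: torch.Tensor, s: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, dim) first-pass token encodings
        # s: (batch, dim) summary of the query-aware context
        s_exp = s.unsqueeze(1).expand_as(x)      # broadcast summary to every position
        cat = torch.cat([x, s_exp], dim=-1)
        z = torch.tanh(self.proj(cat))           # candidate fused representation
        f = torch.sigmoid(self.gate(cat))        # per-dimension gate
        return f * x + (1.0 - f) * z             # gated blend of old and new information
```

In this sketch, x would play the role of BiDAF's first-pass encodings and s a summary (e.g., from a BiLSTM) of the query-aware context; the sigmoid gate decides, per dimension, how much second-pass information to let through at each token.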

Tasks

Question Answering

Benchmark Results

Dataset | Model | Metric | Claimed | Verified | Status
SQuAD1.1 | Ruminating Reader (single model) | EM | 70.64 | — | Unverified
SQuAD1.1 dev | Ruminating Reader | EM | 70.6 | — | Unverified
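For reference, SQuAD's exact-match (EM) metric scores a prediction 1 only if, after normalization (lowercasing, stripping punctuation and articles, collapsing whitespace), it equals one of the reference answers. The sketch below follows the normalization used by the official SQuAD v1.1 evaluation script; function names here are illustrative.

```python
import re
import string

def normalize_answer(s: str) -> str:
    """Lowercase, strip punctuation and articles, collapse whitespace."""
    s = s.lower()
    s = "".join(ch for ch in s if ch not in set(string.punctuation))
    s = re.sub(r"\b(a|an|the)\b", " ", s)
    return " ".join(s.split())

def exact_match(prediction: str, ground_truths: list[str]) -> float:
    """1.0 if the normalized prediction matches any normalized reference."""
    return max(
        float(normalize_answer(prediction) == normalize_answer(gt))
        for gt in ground_truths
    )

# Example: exact_match("The Broncos", ["Broncos", "Denver Broncos"]) -> 1.0
```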

Reproductions

No reproductions have been reported yet.