Densely Connected Attention Propagation for Reading Comprehension

2018-11-10 · NeurIPS 2018 · Code Available

Yi Tay, Luu Anh Tuan, Siu Cheung Hui, Jian Su


Abstract

We propose DecaProp (Densely Connected Attention Propagation), a new densely connected neural architecture for reading comprehension (RC). There are two distinct characteristics of our model. Firstly, our model densely connects all pairwise layers of the network, modeling relationships between passage and query across all hierarchical levels. Secondly, the dense connectors in our network are learned via attention instead of standard residual skip-connectors. To this end, we propose novel Bidirectional Attention Connectors (BAC) for efficiently forging connections throughout the network. We conduct extensive experiments on four challenging RC benchmarks. Our proposed approach achieves state-of-the-art results on all four, outperforming existing baselines by as much as 2.6%-14.2% in absolute F1 score.
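
The core building block, the Bidirectional Attention Connector (BAC), attends in both directions between passage and query and compresses the attended features into a handful of scalars, so that connections can be wired between every pair of layers at low cost. Below is a minimal PyTorch sketch of that idea; the class name, the out_features size, and the plain linear layers standing in for the paper's compression function are illustrative assumptions, not the authors' implementation.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class BidirectionalAttentionConnector(nn.Module):
        """Sketch of a BAC block (not the paper's exact design):
        bidirectional co-attention between passage and query, with the
        attended features compressed to a few scalars that are appended
        to each token representation."""

        def __init__(self, dim: int, out_features: int = 3):
            super().__init__()
            self.proj = nn.Linear(dim, dim)  # shared projection before the affinity
            self.compress_p = nn.Linear(3 * dim, out_features)
            self.compress_q = nn.Linear(3 * dim, out_features)

        def forward(self, p: torch.Tensor, q: torch.Tensor):
            # p: (batch, lp, dim) passage states; q: (batch, lq, dim) query states
            e = torch.bmm(self.proj(p), self.proj(q).transpose(1, 2))  # affinity (batch, lp, lq)
            q2p = torch.bmm(F.softmax(e, dim=2), q)                    # query-aware passage
            p2q = torch.bmm(F.softmax(e, dim=1).transpose(1, 2), p)    # passage-aware query
            # Concatenate raw, attended, and elementwise-product features,
            # then compress them to a small connector appended to every token.
            cp = self.compress_p(torch.cat([p, q2p, p * q2p], dim=-1))
            cq = self.compress_q(torch.cat([q, p2q, q * p2q], dim=-1))
            return torch.cat([p, cp], dim=-1), torch.cat([q, cq], dim=-1)

    # Example: connect 128-d passage/query encodings from any two layers.
    bac = BidirectionalAttentionConnector(dim=128)
    p_out, q_out = bac(torch.randn(2, 100, 128), torch.randn(2, 20, 128))
    # p_out: (2, 100, 131); q_out: (2, 20, 131)

Because the connector appends only a few scalars per token rather than a full hidden state, attaching such a block between all pairwise layers stays cheap, which is the efficiency property the abstract's "efficiently forging connections" refers to.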

Tasks

Reading Comprehension · Question Answering

Benchmark Results

Dataset     Model      Metric   Claimed   Verified   Status
Quasar-T    DecaProp   EM       38.6      —          Unverified
SearchQA    DecaProp   EM       62.2      —          Unverified
SearchQA    DecaProp   EM       56.8      —          Unverified

Reproductions

No reproductions yet. Be the first to reproduce this paper.