Text Understanding with the Attention Sum Reader Network
Rudolf Kadlec, Martin Schmid, Ondrej Bajgar, Jan Kleindienst
Code
- github.com/rkadlec/asreader (official implementation)
- github.com/libertatis/mrc-cbt (community TensorFlow implementation)
Abstract
Several large cloze-style context-question-answer datasets have been introduced recently: the CNN and Daily Mail news data and the Children's Book Test. Thanks to the size of these datasets, the associated text comprehension task is well suited for deep-learning techniques that currently seem to outperform all alternative approaches. We present a new, simple model that uses attention to directly pick the answer from the context, as opposed to computing the answer using a blended representation of words in the document, as is usual in similar models. This makes the model particularly suitable for question-answering problems where the answer is a single word from the document. An ensemble of our models sets a new state of the art on all evaluated datasets.
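The abstract's key idea is to pick the answer by summing attention mass over every occurrence of a candidate word, rather than blending word representations into a single answer vector. The snippet below is a minimal, hypothetical sketch of that readout: the function name, the dot-product scoring, and the toy inputs are assumptions made for exposition, not the paper's actual code (the official model encodes the document and question with bidirectional GRUs before this step).

```python
import numpy as np

def attention_sum_read(context_ids, context_enc, question_enc):
    """Pick an answer by summing attention over repeated candidate words.

    context_ids : (T,) int array of token ids for the document
    context_enc : (T, d) contextual encoding of each document token
    question_enc: (d,)  single vector encoding the question
    Returns the token id whose occurrences receive the largest total attention.
    """
    # Dot-product attention between the question and every document position.
    scores = context_enc @ question_enc            # (T,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                       # softmax over positions

    # Accumulate attention over all occurrences of each candidate word
    # instead of blending their representations.
    totals = {}
    for tok, w in zip(context_ids, weights):
        totals[tok] = totals.get(tok, 0.0) + float(w)
    return max(totals, key=totals.get)

# Toy usage: a 5-token document with one word (id 7) appearing twice.
rng = np.random.default_rng(0)
ctx_ids = np.array([7, 3, 7, 5, 2])
ctx_enc = rng.normal(size=(5, 4))
q_enc = rng.normal(size=4)
print(attention_sum_read(ctx_ids, ctx_enc, q_enc))
```

Because attention from repeated mentions of the same word is accumulated, the readout favours entities mentioned often in the document, which is why the approach suits tasks where the answer is a single word drawn from the context.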
Benchmark Results
| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| SearchQA | ASR | N-gram F1 | 22.8 | — | Unverified |