Words or Characters? Fine-grained Gating for Reading Comprehension
Zhilin Yang, Bhuwan Dhingra, Ye Yuan, Junjie Hu, William W. Cohen, Ruslan Salakhutdinov
Code (official): github.com/kimiyoung/fg-gating
Abstract
Previous work combines word-level and character-level representations using concatenation or scalar weighting, which is suboptimal for high-level tasks like reading comprehension. We present a fine-grained gating mechanism to dynamically combine word-level and character-level representations based on properties of the words. We also extend the idea of fine-grained gating to modeling the interaction between questions and paragraphs for reading comprehension. Experiments show that our approach can improve the performance on reading comprehension tasks, achieving new state-of-the-art results on the Children's Book Test dataset. To demonstrate the generality of our gating mechanism, we also show improved results on a social media tag prediction task.
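The gating idea in the abstract can be sketched as follows: instead of concatenating word- and character-level vectors or mixing them with a single scalar, a per-dimension gate is computed from token features and used to take an elementwise convex combination of the two representations. This is a minimal NumPy sketch, not the paper's implementation; the function name `fine_grained_gate` and the specific feature vector are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def fine_grained_gate(word_emb, char_emb, features, W, b):
    """Combine word- and char-level vectors with a per-dimension gate.

    g = sigmoid(W @ features + b)   # one gate value per embedding dimension
    h = g * char_emb + (1 - g) * word_emb
    """
    g = sigmoid(W @ features + b)
    return g * char_emb + (1.0 - g) * word_emb

# Toy example (dimensions and features are illustrative, not from the paper).
rng = np.random.default_rng(0)
d, f = 4, 3                          # embedding dim, feature dim
word = rng.standard_normal(d)        # word-level embedding of a token
char = rng.standard_normal(d)        # char-level (e.g. char-RNN) embedding
feat = rng.standard_normal(f)        # token properties, e.g. POS/NER/frequency
W = rng.standard_normal((d, f))      # learned gate parameters
b = np.zeros(d)
h = fine_grained_gate(word, char, feat, W, b)
print(h.shape)
```

Because each gate value lies in (0, 1), every output dimension is a convex combination of the corresponding word-level and character-level dimensions, letting the model lean on character information for rare words and word embeddings for frequent ones.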
Benchmark Results
| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| SQuAD 1.1 (test) | Fine-Grained Gating | EM | 62.45 | — | Unverified |
| SQuAD 1.1 (dev) | Fine-Grained Gating | EM | 59.95 | — | Unverified |