A Neural Transition-based Model for Nested Mention Recognition
Bailin Wang, Wei Lu, Yu Wang, Hongxia Jin
- Code: github.com/berlino/nest-trans-em18 (official implementation, PyTorch, ★ 36)
Abstract
Entity mentions commonly contain other mentions recursively. This paper introduces a scalable transition-based method for modeling the nested structure of mentions. We first map a sentence with nested mentions to a designated forest in which each mention corresponds to a constituent. Our shift-reduce system then learns to construct the forest bottom-up through an action sequence whose maximal length is guaranteed to be three times the sentence length. We employ Stack-LSTMs to efficiently and effectively represent the states of the system in a continuous space, and further incorporate a character-based component to capture letter-level patterns. Our model achieves state-of-the-art results on the ACE datasets, demonstrating its effectiveness in detecting nested mentions.
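To make the shift-reduce idea concrete, here is a minimal sketch of how an action sequence can be replayed to recover nested mention spans from a forest built bottom-up over the stack. This is not the authors' implementation: the exact action set (`SHIFT` / `UNARY-X` / `REDUCE-X`) and the dummy label `*` for non-mention constituents are assumptions for illustration, and no neural scoring is shown.

```python
# Illustrative sketch (an assumption, not the paper's code): replay a
# transition sequence and collect labeled spans, where nested mentions
# arise as constituents containing other labeled constituents.

def apply_actions(tokens, actions):
    """Replay actions; return mention spans as (start, end, label) tuples."""
    stack = []                       # each item: (start, end) token span
    buffer = list(range(len(tokens)))
    mentions = []
    for act in actions:
        if act == "SHIFT":           # move next token onto the stack
            i = buffer.pop(0)
            stack.append((i, i))
        elif act.startswith("UNARY-"):   # label the top span as a mention
            label = act[len("UNARY-"):]
            s, e = stack[-1]
            mentions.append((s, e, label))
        elif act.startswith("REDUCE-"):  # merge the top two spans
            label = act[len("REDUCE-"):]
            right = stack.pop()
            left = stack.pop()
            stack.append((left[0], right[1]))
            if label != "*":         # "*" marks a structural, non-mention merge
                mentions.append((left[0], right[1], label))
    return mentions

tokens = ["The", "US", "president", "spoke"]
# "US" (GPE) is nested inside "The US president" (PER).
actions = ["SHIFT", "SHIFT", "UNARY-GPE", "SHIFT",
           "REDUCE-*", "REDUCE-PER", "SHIFT"]
print(apply_actions(tokens, actions))  # → [(1, 1, 'GPE'), (0, 2, 'PER')]
```

Each token is shifted exactly once, and each merge or labeling step consumes one further action, which is why the total action sequence stays linear in the sentence length.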
Benchmark Results
| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| ACE 2004 | Neural transition-based model | F1 | 73.1 | — | Unverified |
| ACE 2004 | Neural transition-based model | F1 | 73.3 | — | Unverified |
| ACE 2005 | Neural transition-based model | F1 | 73.0 | — | Unverified |
| GENIA | Neural transition-based model | F1 | 73.9 | — | Unverified |