SOTAVerified

A Neural Transition-based Model for Nested Mention Recognition

2018-10-03 · EMNLP 2018 · Code Available

Bailin Wang, Wei Lu, Yu Wang, Hongxia Jin

Abstract

Entity mentions commonly contain other mentions recursively. This paper introduces a scalable transition-based method for modeling the nested structure of mentions. We first map a sentence with nested mentions to a designated forest in which each mention corresponds to a constituent of the forest. Our shift-reduce system then learns to construct the forest bottom-up through an action sequence whose maximal length is guaranteed to be three times the sentence length. We employ a Stack-LSTM to represent the states of the system efficiently and effectively in a continuous space, and further incorporate a character-based component to capture letter-level patterns. Our model achieves state-of-the-art results on the ACE datasets, demonstrating its effectiveness in detecting nested mentions.
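The bottom-up forest construction the abstract describes can be sketched as a small shift-reduce replay. The action names (SHIFT, UNARY-X, REDUCE-X) and the span bookkeeping below are illustrative assumptions; the actual system scores actions with a Stack-LSTM over continuous state representations, which is omitted here.

```python
def apply_actions(tokens, actions):
    """Replay an action sequence and collect labeled mention spans.

    SHIFT          -- move the next token from the buffer onto the stack
    ("UNARY", X)   -- mark the stack-top constituent as a mention with label X
    ("REDUCE", X)  -- merge the top two stack items into a mention labeled X

    Spans are returned as (start, end, label) with an exclusive end index.
    """
    stack = []                          # constituent spans as (start, end)
    buffer = list(range(len(tokens)))   # token positions not yet consumed
    mentions = []
    for act in actions:
        if act == "SHIFT":
            i = buffer.pop(0)
            stack.append((i, i + 1))
        elif act[0] == "UNARY":
            start, end = stack[-1]
            mentions.append((start, end, act[1]))
        elif act[0] == "REDUCE":
            right = stack.pop()
            left = stack.pop()
            stack.append((left[0], right[1]))
            mentions.append((left[0], right[1], act[1]))
    return mentions

# Nested example: "US" (GPE) is contained inside "US Army" (ORG).
spans = apply_actions(
    ["US", "Army"],
    ["SHIFT", ("UNARY", "GPE"), "SHIFT", ("REDUCE", "ORG")],
)
print(spans)  # [(0, 1, 'GPE'), (0, 2, 'ORG')]
```

Each token is shifted exactly once and participates in at most one UNARY and one REDUCE that it triggers, which is the intuition behind the 3n bound on action-sequence length stated in the abstract.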

Benchmark Results

Dataset     Model                           Metric   Claimed   Verified   Status
ACE 2004    Neural transition-based model   F1       73.1      —          Unverified
ACE 2004    Neural transition-based model   F1       73.3      —          Unverified
ACE 2005    Neural transition-based model   F1       73.0      —          Unverified
GENIA       Neural transition-based model   F1       73.9      —          Unverified