Syntax-aware Neural Semantic Role Labeling with Supertags
Jungo Kasai, Dan Friedman, Robert Frank, Dragomir Radev, Owen Rambow
- Code (official, TensorFlow): github.com/jungokasai/stagging_srl
Abstract
We introduce a new syntax-aware model for dependency-based semantic role labeling that outperforms syntax-agnostic models for English and Spanish. We use a BiLSTM to tag the text with supertags extracted from dependency parses, and we feed these supertags, along with words and parts of speech, into a deep highway BiLSTM for semantic role labeling. Our model combines the strengths of earlier models that performed SRL on the basis of a full dependency parse with more recent models that use no syntactic information at all. Our local and non-ensemble model achieves state-of-the-art performance on the CoNLL 2009 English and Spanish datasets. SRL models benefit from syntactic information, and we show that supertagging is a simple, powerful, and robust way to incorporate syntax into a neural SRL system.
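The abstract describes a two-stage pipeline: a supertagger first labels each token with a supertag summarizing its syntactic role in the dependency parse, and the SRL network then consumes words, POS tags, and supertags together. The toy sketch below illustrates only that data flow; the lookup-table "supertagger", the tag inventory, and the function names are hypothetical stand-ins, not the paper's BiLSTM models.

```python
# Toy sketch of the syntax-aware input pipeline: a supertagger assigns each
# token a supertag, and (word, POS, supertag) triples form the input to the
# SRL model. All tags and tables here are illustrative placeholders.

def toy_supertagger(tokens):
    # Stand-in for the BiLSTM supertagger: in the paper, supertags are
    # extracted from dependency parses; here we use a fixed toy lexicon.
    lexicon = {"the": "R_head", "cat": "L_det", "ate": "ROOT+L_subj+R_obj"}
    return [lexicon.get(t, "R_head") for t in tokens]

def build_srl_inputs(tokens, pos_tags):
    # Each token's input to the SRL network combines word, POS, and supertag;
    # in the real model these would be concatenated embedding vectors fed to
    # a deep highway BiLSTM.
    supertags = toy_supertagger(tokens)
    return list(zip(tokens, pos_tags, supertags))

inputs = build_srl_inputs(["the", "cat", "ate"], ["DT", "NN", "VBD"])
```

In the actual system, each element of such a triple would index an embedding table, and the concatenated vectors would replace the symbolic triples shown here.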