SOTAVerified

Graph Convolution over Pruned Dependency Trees Improves Relation Extraction

2018-09-26 · EMNLP 2018 · Code Available

Yuhao Zhang, Peng Qi, Christopher D. Manning


Abstract

Dependency trees help relation extraction models capture long-range relations between words. However, existing dependency-based models either neglect crucial information (e.g., negation) by pruning the dependency trees too aggressively, or are computationally inefficient because it is difficult to parallelize over different tree structures. We propose an extension of graph convolutional networks that is tailored for relation extraction, which pools information over arbitrary dependency structures efficiently in parallel. To incorporate relevant information while maximally removing irrelevant content, we further apply a novel pruning strategy to the input trees by keeping words immediately around the shortest path between the two entities among which a relation might hold. The resulting model achieves state-of-the-art performance on the large-scale TACRED dataset, outperforming existing sequence and dependency-based neural models. We also show through detailed analysis that this model has complementary strengths to sequence models, and combining them further improves the state of the art.
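The abstract describes two components: a graph convolution layer that pools information over a dependency tree, and a path-centric pruning strategy that keeps only tokens within distance K of the shortest dependency path between the two entities. The following is a minimal NumPy sketch of those two ideas, not the authors' implementation; the function names, the head-index tree encoding, and the `prune_k` distance parameter are illustrative assumptions.

```python
import numpy as np

def tree_to_adjacency(heads):
    """Build a symmetric adjacency matrix (with self-loops) from a
    dependency tree given as one head index per token (-1 = root).
    Illustrative encoding, not the paper's data format."""
    n = len(heads)
    A = np.eye(n)
    for child, head in enumerate(heads):
        if head >= 0:
            A[child, head] = A[head, child] = 1.0
    return A

def gcn_layer(A, H, W):
    """One graph convolution layer: degree-normalized aggregation over
    tree neighbors, followed by a ReLU nonlinearity."""
    deg = A.sum(axis=1, keepdims=True)
    return np.maximum(0.0, (A @ H @ W) / deg)

def prune_tree(heads, e1, e2, K=1):
    """Path-centric pruning sketch: keep tokens within distance K of the
    shortest dependency path between entity tokens e1 and e2."""
    def ancestors(i):
        path = []
        while i >= 0:
            path.append(i)
            i = heads[i]
        return path
    a1, a2 = ancestors(e1), ancestors(e2)
    # The shortest path runs through the lowest common ancestor (LCA):
    # the first token on e1's ancestor chain that also lies on e2's chain.
    common = set(a1) & set(a2)
    lca = min(common, key=a1.index)
    path = set(a1[:a1.index(lca) + 1]) | set(a2[:a2.index(lca) + 1])
    # Expand the path outward by K tree hops (K=0 keeps the path only).
    A = tree_to_adjacency(heads)
    keep, frontier = set(path), set(path)
    for _ in range(K):
        nxt = set()
        for i in frontier:
            nxt |= {j for j in range(len(heads))
                    if A[i, j] > 0 and j not in keep}
        keep |= nxt
        frontier = nxt
    return sorted(keep)
```

In this sketch, running `gcn_layer` over the adjacency matrix of a pruned tree corresponds to the paper's combination: pruning decides which tokens participate, and the matrix-form convolution lets all trees in a batch be processed in parallel regardless of their individual structures.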

Tasks

Relation Extraction

Benchmark Results

| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| Re-TACRED | C-GCN | F1 | 80.3 | — | Unverified |
| TACRED | C-GCN + PA-LSTM | F1 | 68.2 | — | Unverified |
| TACRED | GCN + PA-LSTM | F1 | 67.1 | — | Unverified |
| TACRED | C-GCN | F1 | 66.4 | — | Unverified |
| TACRED | GCN | F1 | 64.0 | — | Unverified |

Reproductions