Neural Relation Extraction via Inner-Sentence Noise Reduction and Transfer Learning

2018-08-21 · EMNLP 2018

Tianyi Liu, Xinsong Zhang, Wanhao Zhou, Weijia Jia

Abstract

Relation extraction is critical for knowledge base completion and construction, where distantly supervised methods are widely used to extract relational facts automatically using existing knowledge bases. However, the automatically constructed datasets contain many low-quality sentences with noisy words, a problem neglected by current distantly supervised methods that results in unacceptable precision. To mitigate this problem, we propose a novel word-level distantly supervised approach to relation extraction. We first apply Sub-Tree Parse (STP) to remove noisy words that are irrelevant to relations. We then construct a neural network that takes the sub-tree as input and applies entity-wise attention to identify the important semantic features of relational words in each instance. To make our model more robust against noisy words, we initialize our network with prior knowledge learned from the related task of entity classification via transfer learning. We conduct extensive experiments on the New York Times (NYT) corpus and Freebase. Experiments show that our approach is effective and improves the area under the Precision/Recall (PR) curve from 0.35 to 0.39 over the state-of-the-art work.
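The abstract names three mechanisms: sub-tree pruning (STP), entity-wise attention, and transfer-learning initialization. The sketches below illustrate the first two under stated assumptions; they are not the authors' released code. The first is a minimal STP sketch using spaCy, assuming the kept sub-tree is rooted at the lowest common ancestor (LCA) of the two entity tokens; the function name `sub_tree_parse` and the single-token entity matching are illustrative simplifications.

```python
# Hypothetical sketch of Sub-Tree Parse (STP): prune inner-sentence noise
# by keeping only the dependency sub-tree that spans both entities.
# Assumption: the sub-tree is rooted at the entities' lowest common
# ancestor (LCA); the paper may anchor the sub-tree slightly differently.
import spacy

nlp = spacy.load("en_core_web_sm")

def sub_tree_parse(sentence: str, entity1: str, entity2: str) -> list[str]:
    doc = nlp(sentence)
    # Simplification: single-token entities matched by surface form.
    tok1 = next(t for t in doc if t.text == entity1)
    tok2 = next(t for t in doc if t.text == entity2)
    chain1 = {tok1} | set(tok1.ancestors)  # tok1's path up to the root
    lca = tok2
    while lca not in chain1:               # climb from tok2 until the paths meet
        lca = lca.head
    # token.subtree yields the token and all its descendants in order.
    return [t.text for t in lca.subtree]

# With entities inside an appositive, the leading clause is pruned
# (exact output depends on the parser's tree).
print(sub_tree_parse("Obama was born in Honolulu , a city in Hawaii .",
                     "Honolulu", "Hawaii"))
```

The second sketch shows one plausible form of entity-wise attention: each word representation is scored against a query built from the two entity embeddings and the scores are softmax-normalized. The mean-of-entities query and dot-product scoring are assumptions; the paper's exact scoring function may differ.

```python
import numpy as np

def entity_wise_attention(H: np.ndarray, e1: np.ndarray, e2: np.ndarray) -> np.ndarray:
    """Reweight word representations H (n_words x d) by their relevance to
    the entity embeddings e1, e2 (each of size d).
    Assumption: relevance = dot product with the mean entity embedding."""
    query = (e1 + e2) / 2.0
    scores = H @ query                       # one relevance score per word
    weights = np.exp(scores - scores.max())  # numerically stable softmax
    weights /= weights.sum()
    return weights[:, None] * H              # emphasize relation-bearing words
```

In the pipeline the abstract describes, the attention-weighted words of the pruned sub-tree would feed a BiGRU encoder whose parameters are initialized from a model pre-trained on entity classification, which is the transfer-learning step.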

Benchmark Results

Dataset                 Model           Metric   Claimed   Verified   Status
New York Times Corpus   BiGRU+WLA+EWA   AUC      0.39      —          Unverified
New York Times Corpus   BGRU-SET        AUC      0.39      —          Unverified
