Structured Minimally Supervised Learning for Neural Relation Extraction

2019-03-29 · NAACL 2019 · Code Available

Fan Bai, Alan Ritter

Abstract

We present an approach to minimally supervised relation extraction that combines the benefits of learned representations and structured learning, and accurately predicts sentence-level relation mentions given only proposition-level supervision from a KB. By explicitly reasoning about missing data during learning, our approach enables large-scale training of 1D convolutional neural networks while mitigating the issue of label noise inherent in distant supervision. Our approach achieves state-of-the-art results on minimally supervised sentential relation extraction, outperforming a number of baselines, including a competitive approach that uses the attention layer of a purely neural model.
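To make the sentence-level setup concrete, the sketch below shows a generic 1D convolutional sentence encoder for relation classification, of the kind commonly used in distantly supervised relation extraction. It is a minimal illustration, not the authors' released implementation: the dimensions, vocabulary size, position-embedding scheme, and relation inventory are all assumptions chosen for readability.

```python
# Minimal sketch (not the authors' implementation): a generic 1D CNN sentence
# encoder for relation classification under distant supervision. All sizes and
# the relation inventory are illustrative assumptions.
import torch
import torch.nn as nn


class CNNSentenceEncoder(nn.Module):
    def __init__(self, vocab_size=50000, emb_dim=100, pos_dim=5,
                 num_filters=230, window=3, num_relations=53, max_len=120):
        super().__init__()
        # Word embeddings plus position embeddings relative to the two entity mentions.
        self.word_emb = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.pos1_emb = nn.Embedding(2 * max_len, pos_dim, padding_idx=0)
        self.pos2_emb = nn.Embedding(2 * max_len, pos_dim, padding_idx=0)
        in_dim = emb_dim + 2 * pos_dim
        # 1D convolution over the token dimension, followed by max-pooling over time.
        self.conv = nn.Conv1d(in_dim, num_filters, kernel_size=window, padding=window // 2)
        self.dropout = nn.Dropout(0.5)
        self.classifier = nn.Linear(num_filters, num_relations)

    def forward(self, tokens, pos1, pos2):
        # tokens, pos1, pos2: (batch, seq_len) integer tensors.
        x = torch.cat([self.word_emb(tokens),
                       self.pos1_emb(pos1),
                       self.pos2_emb(pos2)], dim=-1)   # (B, T, in_dim)
        x = x.transpose(1, 2)                           # (B, in_dim, T)
        h = torch.relu(self.conv(x))                     # (B, filters, T)
        h, _ = h.max(dim=2)                              # max-pool over time
        return self.classifier(self.dropout(h))          # per-sentence relation scores


# Usage sketch: score a small batch of randomly generated token ids.
if __name__ == "__main__":
    enc = CNNSentenceEncoder()
    toks = torch.randint(1, 50000, (2, 40))
    p1 = torch.randint(1, 240, (2, 40))
    p2 = torch.randint(1, 240, (2, 40))
    print(enc(toks, p1, p2).shape)  # torch.Size([2, 53])
```

The paper's contribution is in how such sentence-level scores are trained from proposition-level KB supervision while modeling missing data; this sketch only covers the encoder side, so the structured learning procedure is not reproduced here.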
