SOTAVerified

Distantly Supervised Relation Extraction with Sentence Reconstruction and Knowledge Base Priors

2021-04-16 · NAACL 2021

Fenia Christopoulou, Makoto Miwa, Sophia Ananiadou



Abstract

We propose a multi-task, probabilistic approach to facilitate distantly supervised relation extraction by bringing closer the representations of sentences that contain the same Knowledge Base pairs. To achieve this, we bias the latent space of sentences via a Variational Autoencoder (VAE) that is trained jointly with a relation classifier. The latent code guides the pair representations and influences sentence reconstruction. Experimental results on two datasets created via distant supervision indicate that multi-task learning results in performance benefits. Additional exploration of incorporating Knowledge Base priors into the VAE reveals that the sentence space can be shifted towards that of the Knowledge Base, offering interpretability and further improving results.
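The joint objective sketched in the abstract (a relation-classification loss combined with a VAE's reconstruction and KL terms over the same latent code) can be illustrated roughly as follows. This is a minimal sketch, not the authors' implementation: the module names, dimensions, bag-of-words reconstruction, standard-normal prior, and equal loss weighting are all illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DSREVAESketch(nn.Module):
    """Toy multi-task model: a VAE over sentence encodings trained jointly
    with a relation classifier. All names and sizes are assumptions."""
    def __init__(self, vocab_size=1000, hidden=64, latent=32, n_relations=5):
        super().__init__()
        self.encoder = nn.Linear(vocab_size, hidden)      # sentence -> hidden state
        self.to_mu = nn.Linear(hidden, latent)            # posterior mean
        self.to_logvar = nn.Linear(hidden, latent)        # posterior log-variance
        self.decoder = nn.Linear(latent, vocab_size)      # latent -> reconstruction logits
        self.classifier = nn.Linear(latent, n_relations)  # latent code guides relation prediction

    def forward(self, bow, relation):
        h = torch.relu(self.encoder(bow))
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        # Reparameterization trick: sample z while keeping gradients flowing
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        # Reconstruction term: how well z regenerates the input
        # (a bag-of-words target here, purely for illustration)
        recon = F.binary_cross_entropy_with_logits(self.decoder(z), bow)
        # KL term: pull the posterior towards the prior (a standard normal here;
        # the paper instead explores priors derived from the Knowledge Base)
        kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
        # Relation classification from the same latent code
        cls = F.cross_entropy(self.classifier(z), relation)
        return cls + recon + kl  # equal weighting is an assumption

model = DSREVAESketch()
bow = torch.rand(8, 1000).round()  # toy batch of bag-of-words "sentences"
rel = torch.randint(0, 5, (8,))    # toy relation labels
loss = model(bow, rel)
```

Because all three terms share the latent code `z`, minimizing this combined loss biases sentences with the same Knowledge Base pair towards nearby latent representations, which is the multi-task effect the abstract describes.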

Tasks

Benchmark Results

| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| NYT Corpus | DSRE-VAE | P@10% | 75.9 | | Unverified |

Reproductions