SOTAVerified

Simple BERT Models for Relation Extraction and Semantic Role Labeling

2019-04-10 · Code Available

Peng Shi, Jimmy Lin



Abstract

We present simple BERT-based models for relation extraction and semantic role labeling. In recent years, state-of-the-art performance has been achieved using neural models by incorporating lexical and syntactic features such as part-of-speech tags and dependency trees. In this paper, extensive experiments on datasets for these two tasks show that without using any external features, a simple BERT-based model can achieve state-of-the-art performance. To our knowledge, we are the first to successfully apply BERT in this manner. Our models provide strong baselines for future research.

Tasks

Relation Extraction, Semantic Role Labeling

Benchmark Results

Dataset   Model            Metric   Claimed   Verified   Status
TACRED    BERT-LSTM-base   F1       67.8      —          Unverified

Reproductions