Simple BERT Models for Relation Extraction and Semantic Role Labeling
2019-04-10
Peng Shi, Jimmy Lin
Code
- github.com/Impavidity/relogic (official, PyTorch)
- github.com/cogcomp/srl-english (PyTorch)
- github.com/riccorl/transformer-srl (PyTorch)
Abstract
We present simple BERT-based models for relation extraction and semantic role labeling. In recent years, state-of-the-art performance on these tasks has been achieved by neural models that incorporate lexical and syntactic features such as part-of-speech tags and dependency trees. In this paper, extensive experiments on datasets for both tasks show that a simple BERT-based model can achieve state-of-the-art performance without any external features. To our knowledge, we are the first to successfully apply BERT in this manner. Our models provide strong baselines for future research.
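To make the model family concrete, here is a minimal sketch of a BERT-LSTM relation classifier in the spirit of the paper's TACRED model: a contextual encoder feeds a BiLSTM, whose final states drive a relation classifier. The architecture is assumed from the model name and may differ from the authors' code; a small randomly initialized `nn.TransformerEncoder` stands in for pretrained BERT so the example runs without downloading weights, and the hyperparameters are illustrative.

```python
import torch
import torch.nn as nn

class BertLstmRelationClassifier(nn.Module):
    """Sketch of a BERT-LSTM relation classifier (architecture assumed,
    not the authors' exact implementation)."""
    def __init__(self, vocab_size=1000, hidden=64, num_relations=42):
        # TACRED has 42 relation labels, including no_relation.
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        layer = nn.TransformerEncoderLayer(
            d_model=hidden, nhead=4, dim_feedforward=128, batch_first=True)
        # Stand-in for pretrained BERT: a small Transformer encoder.
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.lstm = nn.LSTM(hidden, hidden, batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(2 * hidden, num_relations)

    def forward(self, token_ids):
        h = self.encoder(self.embed(token_ids))      # contextual token states
        _, (h_n, _) = self.lstm(h)                   # BiLSTM over the sequence
        sent = torch.cat([h_n[0], h_n[1]], dim=-1)   # final fwd + bwd states
        return self.classifier(sent)                 # relation logits

model = BertLstmRelationClassifier()
logits = model(torch.randint(0, 1000, (2, 16)))  # batch of 2 sentences, 16 tokens
print(logits.shape)  # torch.Size([2, 42])
```

In a full replication, the stand-in encoder would be replaced by pretrained BERT (with the paper's entity-marking input format), and training would minimize cross-entropy over the 42 TACRED labels.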
Tasks
- Relation Extraction
- Semantic Role Labeling
Benchmark Results
| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| TACRED | BERT-LSTM-base | F1 | 67.8 | — | Unverified |