Span-based Joint Entity and Relation Extraction with Transformer Pre-training
Markus Eberts, Adrian Ulges
Code
- github.com/markus-eberts/spert (official, in paper, PyTorch)
- github.com/lavis-nlp/spert (PyTorch)
- github.com/yangyucheng000/Paper-3/tree/main/SpanQualifier (MindSpore)
Abstract
We introduce SpERT, an attention model for span-based joint entity and relation extraction. Our key contribution is a light-weight reasoning on BERT embeddings, which features entity recognition and filtering, as well as relation classification with a localized, marker-free context representation. The model is trained using strong within-sentence negative samples, which are efficiently extracted in a single BERT pass. These aspects facilitate a search over all spans in the sentence. In ablation studies, we demonstrate the benefits of pre-training, strong negative sampling and localized context. Our model outperforms prior work by up to 2.6% F1 score on several datasets for joint entity and relation extraction.
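The search over all spans and the strong within-sentence negative sampling described above can be sketched in a few lines. This is an illustrative sketch only, not the paper's implementation: the function names, the maximum span length, and the example sentence are assumptions, and the real model scores spans with BERT embeddings rather than enumerating them in plain Python.

```python
import random

def enumerate_spans(n_tokens, max_len=10):
    """All (start, end) token spans up to max_len tokens, end exclusive.
    max_len is an assumed hyperparameter, not a value from the paper."""
    return [(i, j) for i in range(n_tokens)
            for j in range(i + 1, min(i + max_len, n_tokens) + 1)]

def sample_negatives(spans, gold_entities, k, seed=0):
    """Strong negatives: non-gold spans drawn from the same sentence,
    so the classifier sees hard, in-context counterexamples."""
    gold = set(gold_entities)
    candidates = [s for s in spans if s not in gold]
    rng = random.Random(seed)
    return rng.sample(candidates, min(k, len(candidates)))

# Hypothetical example sentence and gold entity spans.
tokens = ["Markus", "works", "at", "RheinMain", "University"]
spans = enumerate_spans(len(tokens))
gold = [(0, 1), (3, 5)]  # "Markus", "RheinMain University"
negatives = sample_negatives(spans, gold, k=4)
```

Because every candidate span comes from the same sentence, all of them can be scored from a single BERT pass over that sentence, which is what makes the exhaustive span search affordable.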
Benchmark Results
| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| SciERC | SpERT | F1 | 70.33 | — | Unverified |