Relation Extraction
Relation Extraction is the task of predicting attributes of, and relations between, entities in a sentence. For example, given the sentence “Barack Obama was born in Honolulu, Hawaii.”, a relation classifier aims to predict the relation “bornInCity” between “Barack Obama” and “Honolulu”. Relation Extraction is a key component for building relational knowledge graphs, and it is crucial to natural language processing applications such as structured search, sentiment analysis, question answering, and summarization.
Source: Deep Residual Learning for Weakly-Supervised Relation Extraction
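For concreteness, here is a minimal sketch of sentence-level relation classification with entity markers, similar in spirit to the marker-based entries in the tables below. The checkpoint name `your-org/re-typed-marker` and the toy label set are hypothetical placeholders, not a released model or any specific paper's method.

```python
# Minimal, hypothetical sketch of relation classification with entity markers.
# The checkpoint name and label set are placeholders, not a released model.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "your-org/re-typed-marker"               # hypothetical checkpoint
LABELS = ["no_relation", "bornInCity", "employeeOf"]  # toy label set

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(
    MODEL_NAME, num_labels=len(LABELS)
)

def classify_relation(sentence: str, head: str, tail: str) -> str:
    # Wrap the two candidate entities in marker tokens so the encoder
    # can attend to the specific pair being classified.
    marked = sentence.replace(head, f"[E1] {head} [/E1]")
    marked = marked.replace(tail, f"[E2] {tail} [/E2]")
    inputs = tokenizer(marked, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    return LABELS[int(logits.argmax(dim=-1))]

print(classify_relation(
    "Barack Obama was born in Honolulu, Hawaii.", "Barack Obama", "Honolulu"
))  # a suitably fine-tuned model would output "bornInCity"
```

Wrapping the candidate entities in marker tokens before encoding is the design choice shared by most marker-based systems below; the classifier then only has to decide which relation, if any, links the marked pair.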
Papers
1,977 papers are indexed for this task, spanning datasets such as DocRED, TACRED, SemEval-2010 Task 8, ACE 2005, CoNLL04, the Adverse Drug Events (ADE) Corpus, WebNLG, ChemProt, ACE 2004, NYT11-HRL, CDR, and GDA.
Benchmark Results
DocRED

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | DREEAM | F1 | 67.53 | — | Unverified |
| 2 | KD-Rb-l | F1 | 67.28 | — | Unverified |
| 3 | SSAN-RoBERTa-large+Adaptation | F1 | 65.92 | — | Unverified |
| 4 | SAIS-RoBERTa-large | F1 | 65.11 | — | Unverified |
| 5 | Eider-RoBERTa-large | F1 | 64.79 | — | Unverified |
| 6 | DocuNet-RoBERTa-large | F1 | 64.55 | — | Unverified |
| 7 | CGM2IR-RoBERTa-large | F1 | 63.89 | — | Unverified |
| 8 | SETE-RoBERTa-large | F1 | 63.74 | — | Unverified |
| 9 | ATLOP-RoBERTa-large | F1 | 63.4 | — | Unverified |
| 10 | DRE-MIR-BERT-base | F1 | 63.15 | — | Unverified |

TACRED

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | RAG4RE | F1 | 86.6 | — | Unverified |
| 2 | DeepStruct multi-task w/ finetune | F1 | 76.8 | — | Unverified |
| 3 | UNiST (LARGE) | F1 | 75.5 | — | Unverified |
| 4 | RE-MC | F1 | 75.4 | — | Unverified |
| 5 | GenPT (T5) | F1 | 75.3 | — | Unverified |
| 6 | RECENT+SpanBERT | F1 | 75.2 | — | Unverified |
| 7 | SuRE (PEGASUS-large) | F1 | 75.1 | — | Unverified |
| 8 | EXOBRAIN | F1 | 75.0 | — | Unverified |
| 9 | Relation Reduction | F1 | 74.8 | — | Unverified |
| 10 | RoBERTa-large-typed-marker | F1 | 74.6 | — | Unverified |

SemEval-2010 Task 8

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | SP | F1 | 91.9 | — | Unverified |
| 2 | RIFRE | F1 | 91.3 | — | Unverified |
| 3 | REDN | F1 | 91.0 | — | Unverified |
| 4 | SPOT | F1 | 90.6 | — | Unverified |
| 5 | KLG | F1 | 90.5 | — | Unverified |
| 6 | RELA | F1 | 90.4 | — | Unverified |
| 7 | Skeleton-Aware BERT | F1 | 90.36 | — | Unverified |
| 8 | KnowPrompt | F1 | 90.3 | — | Unverified |
| 9 | LUKE | F1 | 90.3 | — | Unverified |
| 10 | EPGNN | F1 | 90.2 | — | Unverified |

ACE 2005

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | Span-level | NER Micro F1 | 85.98 | — | Unverified |
| 2 | Dual Pointer Network (multi-head) | Relation classification F1 | 80.8 | — | Unverified |
| 3 | Dual Pointer Network | Relation classification F1 | 80.5 | — | Unverified |
| 4 | PL-Marker | RE Micro F1 | 73.0 | — | Unverified |
| 5 | ASP+T5-3B | RE Micro F1 | 72.7 | — | Unverified |
| 6 | GoLLIE | RE Micro F1 | 70.1 | — | Unverified |
| 7 | Ours: cross-sentence ALB | RE Micro F1 | 69.4 | — | Unverified |
| 8 | MGE | RE+ Micro F1 | 68.2 | — | Unverified |
| 9 | HySPA (ours) w/ RoBERTa | Relation F1 | 68.2 | — | Unverified |
| 10 | RNN+CNN | Relation classification F1 | 67.7 | — | Unverified |

CoNLL04

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | ReLiK-Large | RE+ Micro F1 | 78.1 | — | Unverified |
| 2 | REBEL | RE+ Macro F1 | 76.65 | — | Unverified |
| 3 | ASP+T0-3B | RE+ Micro F1 | 76.3 | — | Unverified |
| 4 | Table-Sequence | RE+ Macro F1 | 75.4 | — | Unverified |
| 5 | SpERT | RE+ Macro F1 | 72.87 | — | Unverified |
| 6 | Deeper | RE+ Macro F1 | 72.63 | — | Unverified |
| 7 | TANL | RE+ Micro F1 | 72.6 | — | Unverified |
| 8 | TablERT | RE+ Micro F1 | 72.6 | — | Unverified |
| 9 | TriMF | RE+ Micro F1 | 72.35 | — | Unverified |
| 10 | Multi-turn QA | RE+ Micro F1 | 68.9 | — | Unverified |
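
A note on the metric column in this and the neighboring tables: under the usual convention, “RE” counts a predicted triple as correct when both entity spans and the relation type match the gold annotation, while the stricter “RE+” additionally requires the predicted entity types to be correct. Micro F1 pools counts across all relation types, whereas macro F1 averages per-type F1 scores, so the two are not directly comparable. Below is a minimal sketch of the micro/macro distinction over (head, relation, tail) triples, assuming exact-match scoring:

```python
# Micro vs. macro F1 over relation triples: micro pools tp/fp/fn across all
# relation types; macro averages the per-type F1 scores. A sketch only.
from collections import defaultdict

def f1(tp: int, fp: int, fn: int) -> float:
    # Standard F1 from true positives, false positives, false negatives.
    p = tp / (tp + fp) if tp + fp else 0.0
    r = tp / (tp + fn) if tp + fn else 0.0
    return 2 * p * r / (p + r) if p + r else 0.0

def micro_macro_f1(gold: set, pred: set) -> tuple:
    # gold and pred are sets of (head, relation, tail) triples; exact match.
    counts = defaultdict(lambda: [0, 0, 0])  # per relation type: [tp, fp, fn]
    for _, rel, _ in pred & gold:
        counts[rel][0] += 1
    for _, rel, _ in pred - gold:
        counts[rel][1] += 1
    for _, rel, _ in gold - pred:
        counts[rel][2] += 1
    # Micro: pool counts over all types before computing F1.
    micro = f1(*(sum(c[i] for c in counts.values()) for i in range(3)))
    # Macro: average the per-type F1 scores.
    macro = sum(f1(*c) for c in counts.values()) / len(counts) if counts else 0.0
    return micro, macro

gold = {("Obama", "bornInCity", "Honolulu"), ("Obama", "employeeOf", "US Gov")}
pred = {("Obama", "bornInCity", "Honolulu"), ("Obama", "bornInCity", "Hawaii")}
print(micro_macro_f1(gold, pred))  # -> (0.5, 0.333...): macro punishes the missed type
```

Because macro averaging weights every relation type equally, a model that fails on a single rare type can score noticeably lower on macro F1 than on micro F1 for the same predictions.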

Adverse Drug Events (ADE) Corpus

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | PFN (ALBERT XXL, average aggregation) | RE+ Macro F1 | 83.9 | — | Unverified |
| 2 | Deeper | RE+ Macro F1 | 83.74 | — | Unverified |
| 3 | PFN (ALBERT XXL, no aggregation) | RE+ Macro F1 | 83.2 | — | Unverified |
| 4 | SpERT.PL (without overlap and BioBERT) | RE+ Macro F1 | 82.39 | — | Unverified |
| 5 | REBEL (including overlapping entities) | RE+ Macro F1 | 82.2 | — | Unverified |
| 6 | SpERT.PL (with overlap and BioBERT) | RE+ Macro F1 | 82.03 | — | Unverified |
| 7 | CMAN | RE+ Macro F1 | 81.14 | — | Unverified |
| 8 | Table-Sequence | RE+ Macro F1 | 80.1 | — | Unverified |
| 9 | CLDR + CLNER | RE+ Macro F1 | 79.97 | — | Unverified |
| 10 | SpERT (without overlap) | RE+ Macro F1 | 79.24 | — | Unverified |

WebNLG

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | UniRel | F1 | 94.7 | — | Unverified |
| 2 | PFN | F1 | 93.6 | — | Unverified |
| 3 | SPN | F1 | 93.4 | — | Unverified |
| 4 | TDEER | F1 | 93.1 | — | Unverified |
| 5 | RIFRE | F1 | 92.6 | — | Unverified |
| 6 | TPLinker | F1 | 91.9 | — | Unverified |
| 7 | HBT (CasRel) | F1 | 91.8 | — | Unverified |
| 8 | RIN (BERT, K=2) | F1 | 90.1 | — | Unverified |
| 9 | CGT(UniLM) | F1 | 83.4 | — | Unverified |