Entity Linking
Assigning a unique identity to entities (such as famous individuals, locations, or companies) mentioned in text (Source: Wikipedia).
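The definition above can be made concrete with a minimal two-stage sketch: generate candidate entities for a surface mention from an alias dictionary, then disambiguate by the prior probability P(entity | mention). The alias table and probabilities below are toy assumptions for illustration, not data from any benchmark listed here.

```python
from typing import Optional

# Toy alias dictionary: surface mention -> {candidate entity: prior prob.}.
# Real systems derive these priors from anchor-text statistics (e.g. Wikipedia links).
ALIAS_TABLE = {
    "paris": {"Paris_(France)": 0.92, "Paris_(Texas)": 0.05, "Paris_Hilton": 0.03},
    "jordan": {"Michael_Jordan": 0.55, "Jordan_(country)": 0.40},
}

def link(mention: str) -> Optional[str]:
    """Return the most probable entity for a surface mention, or None if unknown."""
    candidates = ALIAS_TABLE.get(mention.lower())
    if not candidates:
        return None
    # Disambiguation step: pick the candidate with the highest prior.
    return max(candidates, key=candidates.get)
```

Full systems replace the prior-only disambiguation step with context-aware scoring (e.g. encoding the mention's surrounding sentence), but the candidate-generation/disambiguation split is the common skeleton.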
Papers
735 papers address this task.
Datasets: AIDA-CoNLL, KILT: AIDA-YAGO2, KILT: WNED-CWEB, KILT: WNED-WIKI, Derczynski, MSNBC, EC-FUNSD, N3-Reuters-128, OKE-2015, OKE-2016, KORE50, CoNLL-Aida
Benchmark Results
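Most of the tables below report micro-averaged F1 over mention–entity pairs: predictions and gold annotations are pooled across all documents, and precision/recall are computed once over the pooled sets. A minimal sketch, with illustrative (doc_id, span, entity) tuples that are not taken from any benchmark here:

```python
# Micro-averaged F1 over pooled mention–entity pairs.
# A pair counts as correct only if document, span, and entity all match
# (this corresponds to "strong matching" in the tables below).

def micro_f1(predicted: set, gold: set) -> float:
    if not predicted or not gold:
        return 0.0
    tp = len(predicted & gold)          # exact pair matches
    precision = tp / len(predicted)
    recall = tp / len(gold)
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Illustrative data: one correct link, one wrong entity, one missed mention.
gold = {("d1", (0, 5), "Paris_(France)"), ("d1", (10, 16), "France"),
        ("d2", (3, 9), "Michael_Jordan")}
pred = {("d1", (0, 5), "Paris_(France)"), ("d2", (3, 9), "Jordan_(country)")}
```

With these sets, precision is 1/2 and recall is 1/3, giving a micro-F1 of 0.4. KILT-AC, used in several tables below, is instead an accuracy-style metric defined by the KILT benchmark.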
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | SpEL-large (2023) | Micro-F1 strong | 88.6 | — | Unverified |
| 2 | SpEL-base (2023) | Micro-F1 strong | 88.1 | — | Unverified |
| 3 | FusionED | Micro-F1 strong | 86.5 | — | Unverified |
| 4 | ReLiK-Large | Micro-F1 strong | 86.4 | — | Unverified |
| 5 | Zhang et al. (2021) | Micro-F1 strong | 85.8 | — | Unverified |
| 6 | De Cao et al. (2021b) | Micro-F1 strong | 85.5 | — | Unverified |
| 7 | ReLiK-Base | Micro-F1 strong | 85.3 | — | Unverified |
| 8 | ReFinED | Micro-F1 strong | 84 | — | Unverified |
| 9 | De Cao et al. (2021a) | Micro-F1 strong | 83.7 | — | Unverified |
| 10 | Kannan Ravi et al. (2021) | Micro-F1 strong | 83.1 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | GENRE | KILT-AC | 89.85 | — | Unverified |
| 2 | BLINK | KILT-AC | 81.54 | — | Unverified |
| 3 | BART | KILT-AC | 77.55 | — | Unverified |
| 4 | BART + DPR | KILT-AC | 75.49 | — | Unverified |
| 5 | T5-base | KILT-AC | 74.05 | — | Unverified |
| 6 | RAG | KILT-AC | 72.62 | — | Unverified |
| 7 | multitask | KILT-AC | 66.75 | — | Unverified |
| 8 | Multitask DPR + BART | KILT-AC | 24.67 | — | Unverified |
| 9 | Multi-task DPR | KILT-AC | 0 | — | Unverified |
| 10 | multi-task small | KILT-AC | 0 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | GENRE | KILT-AC | 71.22 | — | Unverified |
| 2 | BLINK | KILT-AC | 68.77 | — | Unverified |
| 3 | T5-base | KILT-AC | 49.29 | — | Unverified |
| 4 | BART | KILT-AC | 49.16 | — | Unverified |
| 5 | RAG | KILT-AC | 47.61 | — | Unverified |
| 6 | multitask | KILT-AC | 47.45 | — | Unverified |
| 7 | BART + DPR | KILT-AC | 46.87 | — | Unverified |
| 8 | Multi-task DPR | KILT-AC | 0 | — | Unverified |
| 9 | chriskuei | KILT-AC | 0 | — | Unverified |
| 10 | multi-task small | KILT-AC | 0 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | GENRE | KILT-AC | 87.44 | — | Unverified |
| 2 | BLINK | KILT-AC | 80.24 | — | Unverified |
| 3 | RAG | KILT-AC | 48.07 | — | Unverified |
| 4 | T5-base | KILT-AC | 47.13 | — | Unverified |
| 5 | multitask | KILT-AC | 46.68 | — | Unverified |
| 6 | BART | KILT-AC | 45.91 | — | Unverified |
| 7 | BART + DPR | KILT-AC | 45.2 | — | Unverified |
| 8 | multi-task small | KILT-AC | 0 | — | Unverified |
| 9 | chriskuei | KILT-AC | 0 | — | Unverified |
| 10 | Multi-task DPR | KILT-AC | 0 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | FusionED | Micro-F1 strong | 56.8 | — | Unverified |
| 2 | ReLiK-Large | Micro-F1 | 56.3 | — | Unverified |
| 3 | ReLiK-Base | Micro-F1 | 55.6 | — | Unverified |
| 4 | De Cao et al. (2021a) | Micro-F1 | 54.1 | — | Unverified |
| 5 | ReFinED | Micro-F1 | 50.7 | — | Unverified |
| 6 | van Hulst et al. (2020) | Micro-F1 | 41.1 | — | Unverified |
| 7 | Kolitsas et al. (2018) | Micro-F1 | 34.1 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | Kannan Ravi et al. (2021) | Micro-F1 | 83.4 | — | Unverified |
| 2 | ReLiK-Large | Micro-F1 | 75 | — | Unverified |
| 3 | De Cao et al. (2021a) | Micro-F1 | 73.7 | — | Unverified |
| 4 | FusionED | Micro-F1 strong | 73.6 | — | Unverified |
| 5 | Kolitsas et al. (2018) | Micro-F1 | 72.4 | — | Unverified |
| 6 | van Hulst et al. (2020) | Micro-F1 | 72.4 | — | Unverified |
| 7 | ReLiK-Base | Micro-F1 | 72.3 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | RORE (GeoLayoutLM) | F1 | 87.42 | — | Unverified |
| 2 | GeoLayoutLM | F1 | 86.18 | — | Unverified |
| 3 | RORE (GeoLayoutLM) | F1 | 84.34 | — | Unverified |
| 4 | GeoLayoutLM | F1 | 83.62 | — | Unverified |
| 5 | RORE (LayoutLMv3-large) | F1 | 79.33 | — | Unverified |
| 6 | RORE (LayoutLMv3-base) | F1 | 73.64 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | ReFinED | Micro-F1 | 58.1 | — | Unverified |
| 2 | E2E | Micro-F1 | 54.6 | — | Unverified |
| 3 | FusionED | Micro-F1 strong | 53.1 | — | Unverified |
| 4 | ReLiK-Large | Micro-F1 | 51.7 | — | Unverified |
| 5 | ReLiK-Base | Micro-F1 | 48.1 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | E2E | Micro-F1 | 66.9 | — | Unverified |
| 2 | ReLiK-Large | Micro-F1 | 65.1 | — | Unverified |
| 3 | ReFinED | Micro-F1 | 65 | — | Unverified |
| 4 | ReLiK-Base | Micro-F1 | 62.5 | — | Unverified |
| 5 | FusionED | Micro-F1 strong | 62.3 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | ReFinED | Micro-F1 | 59.5 | — | Unverified |
| 2 | E2E | Micro-F1 | 58.4 | — | Unverified |
| 3 | ReLiK-Large | Micro-F1 | 57.2 | — | Unverified |
| 4 | FusionED | Micro-F1 strong | 56.6 | — | Unverified |
| 5 | ReLiK-Base | Micro-F1 | 52.3 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | ReLiK-Large | Micro-F1 | 72.8 | — | Unverified |
| 2 | ReLiK-Base | Micro-F1 | 68 | — | Unverified |
| 3 | ReFinED | Micro-F1 | 65.9 | — | Unverified |
| 4 | FusionED | Micro-F1 strong | 65.1 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | RELIC + CoNLL-Aida tuning | Accuracy | 94.9 | — | Unverified |
| 2 | Raiman & Raiman 2018 | Accuracy | 94.9 | — | Unverified |
| 3 | Radhakrishnan et al. 2018 | Accuracy | 93 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | ArboEL | Accuracy | 75.73 | — | Unverified |
| 2 | ArboEL-dual | Accuracy | 72.19 | — | Unverified |
| 3 | BioBART | Accuracy | 71.78 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | ReLiK-Large | Micro-F1 | 43 | — | Unverified |
| 2 | FusionED | Micro-F1 strong | 41.6 | — | Unverified |
| 3 | ReLiK-Base | Micro-F1 | 41.6 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | Raiman & Raiman 2018 | Accuracy | 90.9 | — | Unverified |
| 2 | RELIC + CoNLL-Aida tuning | Accuracy | 89.8 | — | Unverified |
| 3 | Radhakrishnan et al. 2018 | Accuracy | 89.6 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | SpEL-large (2023) | Micro-F1 strong | 77.5 | — | Unverified |
| 2 | SpEL-base (2023) | Micro-F1 strong | 73.7 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | Sieve-based+SapBERT | F1-score (strict) | 0.83 | — | Unverified |
| 2 | Sieve-based | F1-score (strict) | 0.81 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | SINGU_GROUP | F1 | 70.51 | — | Unverified |
| 2 | SERA | F1 | 65.96 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | SemEHR+WS (rules+BlueBERT) with tuned amount of training data | F1 | 0.71 | — | Unverified |
| 2 | SemEHR+WS (rules+BlueBERT) | F1 | 0.7 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | SemEHR+WS (rules+BlueBERT) with tuned amount of training data | F1 | 0.86 | — | Unverified |
| 2 | SemEHR+WS (rules+BlueBERT) | F1 | 0.86 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | transformers | Task 1 Accuracy: all | 77.8 | — | Unverified |
| 2 | CTLR | Task 1 Accuracy: all | 76.8 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | ArboEL | Unnormalized Accuracy | 62.53 | — | Unverified |
| 2 | ArboEL-dual | Unnormalized Accuracy | 51.09 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | ERNIE | Accuracy | 57.19 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | baseline | F1 | 26.4 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | SemEHR+WS (rules+BlueBERT) with tuned amount of training data | F1 | 0.91 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | ReLiK-Large | Micro-F1 | 85.1 | — | Unverified |