Entity Typing
Entity Typing is an important task in text analysis. Assigning types (e.g., person, location, organization) to mentions of entities in documents enables effective structured analysis of unstructured text corpora. The extracted type information can be used in a wide range of ways (e.g., serving as primitives for information extraction and knowledge base (KB) completion, and assisting question answering). Traditional entity typing systems focus on a small set of coarse types (typically fewer than 10). Recent studies work with a much larger set of fine-grained types that form a tree-structured hierarchy (e.g., actor as a subtype of artist, and artist as a subtype of person).
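Since typing is a multi-label task (a mention can be both person and actor), results such as those below are usually reported as macro or micro F1 over the predicted type sets. The sketch below, using hypothetical gold/predicted labels, shows how the two averages differ: macro F1 averages per-mention precision and recall, while micro F1 pools counts over all mentions.

```python
# Sketch of common fine-grained entity typing metrics (hypothetical data):
# each mention has a gold set of types and a predicted set of types.

def f1(p, r):
    # Harmonic mean of precision and recall; 0 when both are 0.
    return 2 * p * r / (p + r) if p + r > 0 else 0.0

def typing_metrics(gold, pred):
    """gold, pred: lists of sets of type labels, one set per mention."""
    n = len(gold)
    # Macro: average per-mention precision and recall, then combine.
    macro_p = sum(len(g & p) / len(p) if p else 0.0 for g, p in zip(gold, pred)) / n
    macro_r = sum(len(g & p) / len(g) if g else 0.0 for g, p in zip(gold, pred)) / n
    # Micro: pool true-positive / predicted / gold counts over all mentions.
    tp = sum(len(g & p) for g, p in zip(gold, pred))
    micro_p = tp / sum(len(p) for p in pred)
    micro_r = tp / sum(len(g) for g in gold)
    return f1(macro_p, macro_r), f1(micro_p, micro_r)

gold = [{"person", "artist", "actor"}, {"location"}]
pred = [{"person", "artist"}, {"location", "organization"}]
macro, micro = typing_metrics(gold, pred)
```

On this toy input the two scores diverge (macro ≈ 0.789, micro = 0.75), which is why leaderboards must state which average they report.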
Source: Label Noise Reduction in Entity Typing by Heterogeneous Partial-Label Embedding
Benchmark Results
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | MLMET | F1 | 78.2 | — | Unverified |
| 2 | K-Adapter (fac-adapter) | F1 | 77.69 | — | Unverified |
| 3 | K-Adapter (fac-adapter + lin-adapter) | F1 | 77.61 | — | Unverified |
| 4 | ERNIE | F1 | 75.56 | — | Unverified |
| 5 | MCCE-B (replicated by Adaseq) | F1 | 52.1 | — | Unverified |
| 6 | Prompt + NPCRF (replicated by Adaseq) | F1 | 50.1 | — | Unverified |
| 7 | UniST-Large | F1 | 49.9 | — | Unverified |
| 8 | Prompt Learning (replicated by Adaseq) | F1 | 49.3 | — | Unverified |
| 9 | MLMET | F1 | 49.1 | — | Unverified |
| 10 | RoBERTa-Large + NPCRF (replicated by Adaseq) | F1 | 47.3 | — | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | MLMET | F1 | 49.1 | — | Unverified |
| 2 | ELMo (distant denoising data) | F1 | 40.2 | — | Unverified |
| 3 | LabelGCN Xiong et al. (2019) | F1 | 36.9 | — | Unverified |
| 4 | Choi et al. (2018) w/ augmentation | F1 | 32.0 | — | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | ReFinED | Micro-F1 | 84 | — | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | LITE | Macro F1 | 80.1 | — | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | TextEnt-full | Accuracy | 37.4 | — | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | LITE | Macro F1 | 86.6 | — | Unverified |