TuckER: Tensor Factorization for Knowledge Graph Completion
Ivana Balažević, Carl Allen, Timothy M. Hospedales
Code
- github.com/ibalazevic/TuckER — Official (in paper), PyTorch, ★ 0
- github.com/luffycodes/neptune — PyTorch, ★ 16
- github.com/adaruna3/explainable-kge — PyTorch, ★ 9
- github.com/allenai/kb — PyTorch, ★ 0
- github.com/Sujit-O/pykg2vec — TensorFlow, ★ 0
Abstract
Knowledge graphs are structured representations of real-world facts. However, they typically contain only a small subset of all possible facts. Link prediction is the task of inferring missing facts from existing ones. We propose TuckER, a relatively straightforward but powerful linear model based on the Tucker decomposition of the binary tensor representation of knowledge graph triples. TuckER outperforms previous state-of-the-art models across standard link prediction datasets, acting as a strong baseline for more elaborate models. We show that TuckER is a fully expressive model, derive sufficient bounds on its embedding dimensionalities, and demonstrate that several previously introduced linear models can be viewed as special cases of TuckER.
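The scoring function behind the abstract can be sketched briefly. In the paper, a triple (subject, relation, object) is scored as the Tucker mode product of a learned core tensor with the three embeddings. The sketch below is a minimal NumPy illustration of that contraction; the function name `tucker_score`, the toy dimensions, and the random embeddings are ours, not from the paper, and real training would learn these parameters and pass the score through a sigmoid.

```python
import numpy as np

def tucker_score(W, e_s, w_r, e_o):
    """Score a triple (s, r, o) as W x_1 e_s x_2 w_r x_3 e_o,
    i.e. contract the core tensor W (d_e, d_r, d_e) against the
    subject, relation, and object embeddings along each mode."""
    return np.einsum('ijk,i,j,k->', W, e_s, w_r, e_o)

# Toy example with random parameters (dimensions are illustrative only).
rng = np.random.default_rng(0)
d_e, d_r = 4, 3                       # entity / relation embedding sizes
W = rng.normal(size=(d_e, d_r, d_e))  # shared core tensor
e_s = rng.normal(size=d_e)            # subject entity embedding
w_r = rng.normal(size=d_r)            # relation embedding
e_o = rng.normal(size=d_e)            # object entity embedding
score = tucker_score(W, e_s, w_r, e_o)  # scalar logit for the triple
```

In training, this scalar is typically turned into a probability with a sigmoid and fit with a cross-entropy style loss over candidate objects.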
Benchmark Results
| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| FB15k | TuckER | MRR | 0.8 | — | Unverified |
| FB15k-237 | TuckER | Hits@1 | 0.27 | — | Unverified |
| WN18 | TuckER | Hits@10 | 0.96 | — | Unverified |
| WN18RR | TuckER | Hits@10 | 0.53 | — | Unverified |
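The metrics in the table are standard ranking measures: for each test triple, the model ranks the true entity among all candidates, and MRR is the mean reciprocal of those ranks while Hits@k is the fraction of ranks at or below k. A minimal sketch of how they are computed from a list of ranks (the helper names and the toy rank list are ours):

```python
def mrr(ranks):
    """Mean reciprocal rank: average of 1/rank over all test triples."""
    return sum(1.0 / r for r in ranks) / len(ranks)

def hits_at_k(ranks, k):
    """Fraction of test triples whose true entity is ranked in the top k."""
    return sum(1 for r in ranks if r <= k) / len(ranks)

# Toy ranks for five hypothetical test triples.
ranks = [1, 3, 2, 10, 1]
print(mrr(ranks))           # (1 + 1/3 + 1/2 + 1/10 + 1) / 5
print(hits_at_k(ranks, 1))  # 2 of 5 ranks are 1
print(hits_at_k(ranks, 10)) # all 5 ranks are <= 10
```

Higher is better for both metrics, which is why Hits@10 on the easier WN18 split (0.96) exceeds Hits@10 on the harder WN18RR split (0.53).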