MEIM: Multi-partition Embedding Interaction Beyond Block Term Format for Efficient and Expressive Link Prediction
Hung Nghiep Tran, Atsuhiro Takasu
Code
- github.com/tranhungnghiep/meim-kge (official, in paper, PyTorch, ★ 24)
- github.com/tranhungnghiep/AnalyzeKGE (TensorFlow, ★ 28)
- github.com/tranhungnghiep/MEI-KGE (PyTorch, ★ 6)
Abstract
Knowledge graph embedding aims to predict the missing relations between entities in knowledge graphs. Tensor-decomposition-based models, such as ComplEx, provide a good trade-off between efficiency and expressiveness, which is crucial because of the large size of real-world knowledge graphs. The recent multi-partition embedding interaction (MEI) model subsumes these models by using the block term tensor format and provides a systematic solution for the trade-off. However, MEI has several drawbacks, some of which are carried over from the tensor-decomposition-based models it subsumes. In this paper, we address these drawbacks and introduce the Multi-partition Embedding Interaction iMproved beyond block term format (MEIM) model, with independent core tensors for ensemble effects and soft orthogonality for max-rank mapping, in addition to multi-partition embedding. MEIM improves expressiveness while remaining highly efficient, helping it to outperform strong baselines and achieve state-of-the-art results on difficult link prediction benchmarks using fairly small embedding sizes. The source code is released at https://github.com/tranhungnghiep/MEIM-KGE.
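The two mechanisms the abstract names, independent per-partition core tensors and a soft-orthogonality penalty on mappings, can be sketched in a few lines. This is a minimal NumPy illustration of the general ideas, not the official implementation; the shapes, variable names, and the plain Frobenius-norm penalty `||MᵀM − I||²` are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
K, D = 3, 4  # number of embedding partitions, per-partition size (illustrative)

# Hypothetical multi-partition embeddings for one (head, relation, tail) triple:
# each entity/relation vector is split into K partitions of size D.
h = rng.normal(size=(K, D))
r = rng.normal(size=(K, D))
t = rng.normal(size=(K, D))

# One independent core tensor per partition; using separate, unconstrained cores
# per partition is the "ensemble effect" idea, in contrast to the shared
# structure imposed by the block term format.
W = rng.normal(size=(K, D, D, D))

def score(h, r, t, W):
    """Sum over partitions of the trilinear interaction
    sum_{i,j,l} W[k,i,j,l] * h[k,i] * r[k,j] * t[k,l]."""
    return float(np.einsum('kijl,ki,kj,kl->', W, h, r, t))

def soft_orthogonality_penalty(M):
    """Frobenius-norm penalty ||M^T M - I||_F^2 that pushes a mapping matrix
    toward orthogonality (hence max rank); added to the training loss with a
    regularization weight. A sketch of 'soft orthogonality', not MEIM's exact
    regularizer."""
    G = M.T @ M - np.eye(M.shape[1])
    return float(np.sum(G * G))

s = score(h, r, t, W)           # a single real-valued triple score
p = soft_orthogonality_penalty(rng.normal(size=(D, D)))
```

An exactly orthogonal matrix incurs zero penalty, so the regularizer only pays when a mapping drifts away from full rank; the trade-off in the abstract (expressiveness vs. efficiency) shows up here as the choice of K and D.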
Benchmark Results
| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| FB15k-237 | MEIM | Hits@1 | 0.27 | — | Unverified |
| WN18RR | MEIM | Hits@10 | 0.58 | — | Unverified |
| YAGO3-10 | MEIM | Hits@1 | 0.51 | — | Unverified |