KG-BERT: BERT for Knowledge Graph Completion
Liang Yao, Chengsheng Mao, Yuan Luo
Code
- github.com/yao8839836/kg-bert (official, in paper; PyTorch; ★ 0)
- github.com/gychant/CSKMTermDefn (PyTorch; ★ 6)
- github.com/ManasRMohanty/DS5500-capstone (PyTorch; ★ 0)
Abstract
Knowledge graphs are important resources for many artificial intelligence tasks but often suffer from incompleteness. In this work, we propose to use pre-trained language models for knowledge graph completion. We treat triples in knowledge graphs as textual sequences and propose a novel framework named Knowledge Graph Bidirectional Encoder Representations from Transformer (KG-BERT) to model these triples. Our method takes the entity and relation descriptions of a triple as input and computes the scoring function of the triple with the KG-BERT language model. Experimental results on multiple benchmark knowledge graphs show that our method can achieve state-of-the-art performance on the triple classification, link prediction, and relation prediction tasks.
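The abstract describes packing a triple's textual descriptions into a single BERT-style input sequence whose [CLS] representation is then scored for plausibility. The following is a minimal sketch of that input construction only; the function name `triple_to_sequence` and the example descriptions are illustrative assumptions, not code from the paper's repository.

```python
def triple_to_sequence(head_desc: str, relation: str, tail_desc: str,
                       cls: str = "[CLS]", sep: str = "[SEP]") -> str:
    """Serialize a knowledge-graph triple as a single text sequence
    in the KG-BERT style: the head entity description, relation name,
    and tail entity description are concatenated with separator tokens.
    The resulting string would be tokenized and fed to BERT, whose
    [CLS] output is used to score the triple's plausibility.
    (Illustrative sketch; names are assumptions, not the official code.)"""
    return f"{cls} {head_desc} {sep} {relation} {sep} {tail_desc} {sep}"


# Hypothetical example triple (Steve Jobs, founded, Apple Inc.):
seq = triple_to_sequence("Steve Jobs", "founded", "Apple Inc.")
# → "[CLS] Steve Jobs [SEP] founded [SEP] Apple Inc. [SEP]"
```

In practice the descriptions can be multi-word entity definitions rather than names, and the sequence is truncated to BERT's maximum input length before encoding.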
Benchmark Results
| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| FB15k-237 | KG-BERT | Hits@10 | 0.42 | — | Unverified |
| UMLS | KG-BERT | Hits@10 | 0.99 | — | Unverified |
| WN18RR | KG-BERT | Hits@10 | 0.52 | — | Unverified |