KG-BERT: BERT for Knowledge Graph Completion

2019-09-07 · Code Available

Liang Yao, Chengsheng Mao, Yuan Luo

Abstract

Knowledge graphs are important resources for many artificial intelligence tasks but often suffer from incompleteness. In this work, we propose to use pre-trained language models for knowledge graph completion. We treat triples in knowledge graphs as textual sequences and propose a novel framework named Knowledge Graph Bidirectional Encoder Representations from Transformer (KG-BERT) to model these triples. Our method takes the entity and relation descriptions of a triple as input and computes a scoring function for the triple with the KG-BERT language model. Experimental results on multiple benchmark knowledge graphs show that our method can achieve state-of-the-art performance in triple classification, link prediction, and relation prediction tasks.
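
As a rough illustration of the approach, the sketch below scores a single triple by packing its textual descriptions into one BERT input sequence and reading off a plausibility probability. It assumes the Hugging Face transformers library and the bert-base-uncased checkpoint (both assumptions, not the authors' released code), omits the paper's alternating segment embeddings for head/relation/tail, and would need fine-tuning on labeled triples before its scores mean anything.

import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # binary: triple plausible or not
)
model.eval()

def score_triple(head_desc, relation, tail_desc):
    # Pack the triple as "[CLS] head [SEP] relation [SEP] tail [SEP]";
    # the tokenizer adds [CLS] and the trailing [SEP], and the inline
    # "[SEP]" strings are recognized as the special separator token.
    text = f"{head_desc} [SEP] {relation} [SEP] {tail_desc}"
    inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=128)
    with torch.no_grad():
        logits = model(**inputs).logits
    # Probability of the (assumed) positive class: the triple is plausible.
    return torch.softmax(logits, dim=-1)[0, 1].item()

# Hypothetical usage with free-text entity descriptions:
print(score_triple(
    "Steven Spielberg is an American film director and producer.",
    "director of",
    "Jurassic Park is a 1993 American science fiction film.",
))

For link prediction, the same scorer would be applied to every candidate entity substituted into the triple, with candidates ranked by score.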

Tasks

Triple Classification · Link Prediction · Relation Prediction

Benchmark Results

Dataset    Model    Metric   Claimed  Verified  Status
FB15k-237  KG-BERT  Hits@10  0.42     –         Unverified
UMLS       KG-BERT  Hits@10  0.99     –         Unverified
WN18RR     KG-BERT  Hits@10  0.52     –         Unverified
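
For context, Hits@10 is the share of test triples for which the correct entity appears among the model's ten highest-ranked candidates. A minimal sketch of the computation (the helper name is illustrative):

def hits_at_10(ranks):
    # `ranks` holds the 1-based rank of the correct entity among all
    # candidate entities for each test triple.
    return sum(1 for r in ranks if r <= 10) / len(ranks)

# Example: correct entities ranked 3rd, 7th, and 41st give Hits@10 = 2/3.
print(hits_at_10([3, 7, 41]))  # 0.6666...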

Reproductions