
RotatE: Knowledge Graph Embedding by Relational Rotation in Complex Space

2019-02-26 · ICLR 2019 · Code Available

Zhiqing Sun, Zhi-Hong Deng, Jian-Yun Nie, Jian Tang


Abstract

We study the problem of learning representations of entities and relations in knowledge graphs for predicting missing links. The success of such a task heavily relies on the ability of modeling and inferring the patterns of (or between) the relations. In this paper, we present a new approach for knowledge graph embedding called RotatE, which is able to model and infer various relation patterns including: symmetry/antisymmetry, inversion, and composition. Specifically, the RotatE model defines each relation as a rotation from the source entity to the target entity in the complex vector space. In addition, we propose a novel self-adversarial negative sampling technique for efficiently and effectively training the RotatE model. Experimental results on multiple benchmark knowledge graphs show that the proposed RotatE model is not only scalable, but also able to infer and model various relation patterns and significantly outperform existing state-of-the-art models for link prediction.
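The two core ideas in the abstract can be sketched in a few lines: each relation is an element-wise rotation e^{iθ} in complex space (so composing relations composes rotations, and θ = 0 or π gives symmetric relations), and negatives are reweighted by a softmax over the model's own scores. This is an illustrative sketch, not the authors' implementation; the function names and the L1-norm distance are assumptions.

```python
import numpy as np

def rotate_score(head, relation_phase, tail):
    """RotatE-style score: the relation acts as an element-wise rotation
    e^{i*theta} in the complex plane, so a plausible triple (h, r, t)
    has t close to h * e^{i*theta}. Higher (less negative) = more plausible.
    The L1 norm here is an illustrative choice."""
    rotation = np.exp(1j * relation_phase)          # unit-modulus entries
    return -np.linalg.norm(head * rotation - tail, ord=1)

def self_adversarial_weights(neg_scores, alpha=1.0):
    """Self-adversarial negative sampling (sketch): weight each negative
    triple by a softmax over the model's own scores, so harder
    (higher-scoring) negatives contribute more to the loss."""
    logits = alpha * np.asarray(neg_scores, dtype=float)
    logits -= logits.max()                          # numerical stability
    w = np.exp(logits)
    return w / w.sum()

# Toy check: a tail produced by rotating the head scores better than a random one.
rng = np.random.default_rng(0)
h = rng.normal(size=4) + 1j * rng.normal(size=4)
theta = rng.uniform(-np.pi, np.pi, size=4)
t_true = h * np.exp(1j * theta)
t_rand = rng.normal(size=4) + 1j * rng.normal(size=4)
assert rotate_score(h, theta, t_true) > rotate_score(h, theta, t_rand)
```

Because the rotation has unit modulus, inverting a relation is just negating its phase, which is how the model captures inversion patterns.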

Benchmark Results

| Dataset    | Model   | Metric  | Claimed | Verified | Status     |
|------------|---------|---------|---------|----------|------------|
| FB122      | RotatE  | Hits@3  | 70.8    |          | Unverified |
| FB15k      | pRotatE | MRR     | 0.8     |          | Unverified |
| FB15k      | RotatE  | MRR     | 0.8     |          | Unverified |
| FB15k-237  | pRotatE | Hits@1  | 0.23    |          | Unverified |
| WN18       | RotatE  | Hits@10 | 0.96    |          | Unverified |
| WN18       | pRotatE | Hits@10 | 0.96    |          | Unverified |
| WN18RR     | RotatE  | Hits@10 | 0.57    |          | Unverified |
| WN18RR     | pRotatE | Hits@10 | 0.55    |          | Unverified |
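The metrics above are standard link-prediction statistics computed from the rank of the correct entity among all candidates. A minimal sketch (the function name is illustrative):

```python
def mrr_and_hits(ranks, k=10):
    """Compute Mean Reciprocal Rank and Hits@k from per-triple ranks,
    where rank 1 means the correct entity was scored highest."""
    mrr = sum(1.0 / r for r in ranks) / len(ranks)
    hits = sum(r <= k for r in ranks) / len(ranks)
    return mrr, hits

# e.g. ranks of the true entity for four test triples
mrr, hits10 = mrr_and_hits([1, 2, 5, 20], k=10)
```

In practice these ranks are "filtered": other known true triples are excluded from the candidate list before ranking.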

Reproductions