SOTAVerified

Augmenting and Tuning Knowledge Graph Embeddings

2019-07-01

Robert Bamler, Farnood Salehi, Stephan Mandt

Code Available

Abstract

Knowledge graph embeddings rank among the most successful methods for link prediction in knowledge graphs, i.e., the task of completing an incomplete collection of relational facts. A downside of these models is their strong sensitivity to model hyperparameters, in particular regularizers, which have to be extensively tuned to reach good performance [Kadlec et al., 2017]. We propose an efficient method for large scale hyperparameter tuning by interpreting these models in a probabilistic framework. After a model augmentation that introduces per-entity hyperparameters, we use a variational expectation-maximization approach to tune thousands of such hyperparameters with minimal additional cost. Our approach is agnostic to details of the model and results in a new state of the art in link prediction on standard benchmark data.

Tasks

Benchmark Results

| Dataset | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| FB15k | DistMult (after variational EM) | MRR | 0.84 | — | Unverified |
| FB15k-237 | DistMult (after variational EM) | Hits@10 | 0.55 | — | Unverified |
| WN18 | DistMult (after variational EM) | MRR | 0.91 | — | Unverified |
| WN18RR | DistMult (after variational EM) | MRR | 0.46 | — | Unverified |

Reproductions