SOTAVerified

KERMIT: Knowledge Graph Completion of Enhanced Relation Modeling with Inverse Transformation

2023-09-26

Haotian Li, Bin Yu, Yuliang Wei, Kai Wang, Richard Yi Da Xu, Bailing Wang

Code Available

Abstract

Knowledge graph completion (KGC) aims to populate missing triples in a knowledge graph using available information. Text-based methods, which depend on textual descriptions of triples, often struggle when these descriptions lack sufficient information for accurate prediction, an issue inherent to the datasets and not easily resolved through modeling alone. To address this and ensure data consistency, we first use large language models (LLMs) to generate coherent descriptions, bridging the semantic gap between queries and answers. Second, we utilize inverse relations to create a symmetric graph, thereby providing augmented training samples for KGC. Additionally, we employ the label information inherent in knowledge graphs (KGs) to enhance the existing contrastive framework, making it fully supervised. These efforts lead to significant performance improvements on the WN18RR and FB15k-237 datasets. According to standard evaluation metrics, our approach achieves a 4.2% improvement in Hit@1 on WN18RR and a 3.4% improvement in Hit@3 on FB15k-237, demonstrating superior performance.
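The inverse-relation augmentation mentioned in the abstract can be sketched as follows; this is a minimal illustration, not the paper's implementation, and the `_inv` relation suffix is an assumed naming convention:

```python
def augment_with_inverses(triples):
    """Return the original triples plus their inverses.

    Each triple is (head, relation, tail); for every (h, r, t) an inverse
    triple (t, r_inv, h) is added, making the graph symmetric and doubling
    the number of training samples.
    """
    augmented = list(triples)
    for head, rel, tail in triples:
        augmented.append((tail, rel + "_inv", head))
    return augmented

triples = [("Paris", "capital_of", "France")]
print(augment_with_inverses(triples))
# [('Paris', 'capital_of', 'France'), ('France', 'capital_of_inv', 'Paris')]
```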

Benchmark Results

| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| FB15k-237 | KERMIT | Hits@1 | 0.27 | | Unverified |
| WN18RR | KERMIT | Hits@10 | 0.83 | | Unverified |
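The Hits@k metric reported above can be computed from the rank each test query assigns to its correct answer; a minimal sketch (the ranks shown are illustrative, not the paper's data):

```python
def hits_at_k(ranks, k):
    """Fraction of test queries whose correct answer is ranked in the top k.

    `ranks` holds the 1-based rank of the true entity for each query;
    Hits@1 of 0.27 means 27% of queries rank the correct entity first.
    """
    return sum(1 for r in ranks if r <= k) / len(ranks)

ranks = [1, 3, 12, 2, 50]
print(hits_at_k(ranks, 1))   # 0.2
print(hits_at_k(ranks, 10))  # 0.6
```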

Reproductions