
Alleviating the Sparsity of Open Knowledge Graphs with Pretrained Contrastive Learning

2021-11-16 · ACL ARR November 2021

Anonymous


Abstract

Due to the sparsity of formal knowledge and the roughness of non-ontological construction methods, relevant facts are often missing in Open Knowledge Graphs (OpenKGs). Although existing completion methods have achieved promising performance, they do not alleviate the sparsity problem of OpenKGs. Because sparse links offer few training opportunities, many few-shot and zero-shot entities cannot fully learn high-dimensional features. In this paper, we propose a new OpenKG Contrastive Learning (OKGCL) model that alleviates this sparsity with contrastive entities and relations. OKGCL designs (a) negative entities to discriminate different entities with the same relation, (b) negative relations to discriminate different relations with the same entity-pair, and (c) self-positive samples that give zero-shot and few-shot entities the chance to learn discriminative representations. Extensive experiments on benchmark datasets show the superiority of OKGCL over state-of-the-art models.
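The abstract's (a)-(c) all instantiate a standard contrastive objective: pull an anchor embedding toward a positive sample and push it away from negatives. The sketch below is a generic InfoNCE-style loss, not the paper's actual OKGCL objective (which is not specified here); the function name, temperature value, and cosine scoring are illustrative assumptions. The "negatives" argument would hold negative entities (same relation, different entity) or negative relations (same entity-pair, different relation), while a sparse entity's self-positive would simply be its own (augmented) embedding.

```python
import numpy as np

def info_nce(anchor, positive, negatives, temperature=0.1):
    """Generic InfoNCE-style contrastive loss (illustrative sketch).

    anchor, positive: 1-D embedding vectors.
    negatives: list of 1-D embedding vectors to push away from.
    Returns a non-negative scalar loss; lower means the anchor is
    closer to the positive than to the negatives.
    """
    def cos(a, b):
        return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

    # Similarity logits: positive first, then all negatives.
    logits = np.array([cos(anchor, positive)] +
                      [cos(anchor, n) for n in negatives]) / temperature
    logits -= logits.max()  # numerical stability before softmax
    # Cross-entropy with the positive at index 0.
    return float(-np.log(np.exp(logits[0]) / np.exp(logits).sum()))
```

For a zero-shot entity with no linked facts, using the entity's own representation as the positive (mechanism (c)) still produces gradient signal against the negatives, which is how a self-positive gives sparse entities a chance to learn discriminative features.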
