
ReadE: Learning Relation-Dependent Entity Representation for Knowledge Graph Completion

2021-11-16 · ACL ARR November 2021

Anonymous


Abstract

Conventional knowledge graph embedding methods learn semantic representations for entities by modeling their interactions with powerful graph neural networks. However, previous methods assign each node a single coarse-grained representation, ignoring that different relations emphasize different aspects of an entity's semantics. To tackle this problem, we propose ReadE, a method that learns relation-dependent entity representations whose semantic content is emphasized differently by different relation types. First, we propose a relation-controlled gating mechanism that uses the relation to control the information flow in the aggregation step of the graph neural network. Second, we propose a contrastive learning method that mixes relation-level and entity-level negative samples to enhance the semantics preserved in relation-dependent entity representations. Experiments on three benchmarks show that our proposed model outperforms all strong baselines. The code will be open-sourced on GitHub.
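The abstract does not give the exact formulation, but the core idea of relation-controlled gating can be sketched as follows: for each edge (head, relation, tail), a gate computed from the neighbor and the relation embedding scales the neighbor's message before aggregation, so the same entity contributes differently under different relation types. Everything below (embedding sizes, the concatenation-based gate parameterization `W`, the toy triples) is a hypothetical illustration, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

d = 4                              # embedding dimension (toy size)
n_ent, n_rel = 3, 2                # toy graph: 3 entities, 2 relation types
E = rng.normal(size=(n_ent, d))    # entity embeddings
R = rng.normal(size=(n_rel, d))    # relation embeddings
W = rng.normal(size=(d, 2 * d))    # gate projection (hypothetical parameterization)

edges = [(0, 0, 2), (1, 1, 2)]     # (head, relation, tail) triples

# Relation-controlled gating: the gate, computed from the neighbor entity and
# the relation, decides how much of the neighbor's representation flows to the
# tail during aggregation, making the aggregated representation relation-dependent.
agg = np.zeros((n_ent, d))
for h, r, t in edges:
    gate = sigmoid(W @ np.concatenate([E[h], R[r]]))  # element-wise gate in (0, 1)
    agg[t] += gate * E[h]                             # relation-dependent message

print(agg[2])  # entity 2's aggregate depends on which relations point to it
```

With a different relation embedding on the same edge, the gate (and hence the message from the same neighbor) changes, which is the relation-dependent behavior the abstract describes.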
