
Mixture-of-Graphs: Zero-shot Relational Learning for Knowledge Graph by Fusing Ontology and Textual Experts

2021-11-16 · ACL ARR November 2021

Anonymous


Abstract

Knowledge Graph Embedding (KGE) models have been proposed and successfully applied to Knowledge Graph Completion (KGC). However, dominant KGE models often fail at zero-shot relational learning because they cannot learn effective representations for unseen relations. Previous studies mainly use either the textual description of a relation or its neighboring relations, in isolation, to represent unseen relations. In fact, the semantics of a relation can be expressed by three kinds of graphs: the factual graph, the ontology graph, and the textual description graph, and these can complement and reinforce one another. Therefore, to obtain more accurate relation representations in zero-shot learning, we propose mixture-of-graphs (MoG) experts to improve the performance of existing KGE models on unseen relations. We build multi-aspect associations between seen and unseen relations, which directly guide previous KGE methods such as TransE and RotatE in zero-shot relational learning. Experiments on multiple public datasets verify the effectiveness of the proposed method, which improves the state-of-the-art zero-shot relational learning method by 12.84% in Hits@10 on average.
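The core idea above can be illustrated with a minimal sketch: an unseen relation's embedding is formed as a gated mixture of embeddings contributed by three "experts" (factual, ontology, and textual-description graphs), and that fused vector is then plugged into a standard TransE score. All names, gating weights, and expert embeddings below are hypothetical illustrations, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8

# Hypothetical embeddings for one unseen relation, each produced by a
# different "expert" graph view (assumed inputs, not from the paper).
expert_embeddings = {
    "factual": rng.normal(size=dim),
    "ontology": rng.normal(size=dim),
    "textual": rng.normal(size=dim),
}

def mixture_relation_embedding(experts, gate_logits):
    """Fuse expert embeddings into one relation vector via softmax gating."""
    w = np.exp(gate_logits - np.max(gate_logits))  # numerically stable softmax
    w /= w.sum()
    return sum(wi * experts[name] for wi, name in zip(w, experts))

def transe_score(h, r, t):
    """TransE plausibility score: negative L1 norm of h + r - t."""
    return -np.abs(h + r - t).sum()

# Assumed learned gating scores over the three experts.
gate_logits = np.array([0.5, 1.0, 0.2])
r_unseen = mixture_relation_embedding(expert_embeddings, gate_logits)

# A tail entity constructed to satisfy t = h + r scores (near) maximally.
h = rng.normal(size=dim)
t = h + r_unseen
print(transe_score(h, r_unseen, t))
```

Swapping `transe_score` for a RotatE-style score (complex rotation instead of translation) leaves the mixture step unchanged, which is how a fused relation embedding can guide multiple base KGE models.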
