
Improving Neural Knowledge Base Completion with Cross-Lingual Projections

2017-04-01 · EACL 2017

Patrick Klein, Simone Paolo Ponzetto, Goran Glavaš


Abstract

In this paper we present a cross-lingual extension of a neural tensor network model for knowledge base completion. We exploit multilingual synsets from BabelNet to translate English triples to other languages and then augment the reference knowledge base with cross-lingual triples. We project monolingual embeddings of different languages to a shared multilingual space and use them for network initialization (i.e., as initial concept embeddings). We then train the network with triples from the cross-lingually augmented knowledge base. Results on WordNet link prediction show that leveraging cross-lingual information yields significant gains over exploiting only monolingual triples.
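The abstract mentions projecting monolingual embeddings of different languages into a shared multilingual space before using them to initialize the network. A common way to realize such a projection (not necessarily the authors' exact method) is to learn a linear map from a seed dictionary of translation pairs by least squares. Below is a minimal sketch of that idea with toy data; all names, dimensions, and vectors are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Toy illustration of a cross-lingual linear projection.
# Assumption: we have embeddings for N aligned translation pairs.
rng = np.random.default_rng(0)
dim = 4

# Rows are aligned words: X holds source-language vectors (e.g. German),
# Y holds target-language vectors defining the shared space (e.g. English).
X = rng.normal(size=(5, dim))
Y = rng.normal(size=(5, dim))

# Solve min_W ||X W - Y||_F^2 in closed form via least squares.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Project source-language vectors into the shared space; such projected
# vectors would then serve as initial concept embeddings for the
# knowledge base completion model.
projected = X @ W
print(projected.shape)  # (5, 4)
```

In practice, larger seed dictionaries and refinements such as orthogonality constraints on `W` are often used, but the core operation remains a single matrix multiplication applied to every source-language embedding.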
