
Joint Semantic and Distributional Word Representations with Multi-Graph Embeddings

WS 2019 · 2019-11-01

Pierre Daix-Moreux, Matthias Gallé


Abstract

Word embeddings continue to be of great use to NLP researchers and practitioners due to their training speed and ease of use and distribution. Prior work has shown that word representations can be improved by incorporating semantic knowledge-bases. In this paper we propose a novel way of combining such knowledge-bases while retaining the lexical information of word co-occurrences. The approach is conceptually clear: it maps both distributional and semantic information into a multi-graph and modifies existing node-embedding techniques to compute word representations. Our experiments show improved results, compared to vanilla word embeddings, retrofitting, and concatenation techniques using the same information, on a variety of word-similarity datasets.
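The core idea of the abstract can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's implementation: it assumes a toy corpus of co-occurrence pairs and a toy set of semantic (synonym) pairs, merges them into a single multi-graph with parallel edges, and generates uniform random walks over it, which would then feed a skip-gram model (as in DeepWalk-style node embeddings). All data and names here are hypothetical.

```python
import random
from collections import defaultdict

# Hypothetical toy inputs: distributional co-occurrence pairs from a
# corpus, and semantic pairs from a knowledge-base (e.g. synonyms).
cooccurrence_edges = [("cat", "sat"), ("cat", "mat"), ("dog", "sat")]
semantic_edges = [("cat", "feline"), ("dog", "canine")]

# Multi-graph as adjacency lists: parallel edges from both sources are
# kept, so a word pair related both distributionally and semantically
# is sampled proportionally more often during the walks.
graph = defaultdict(list)
for u, v in cooccurrence_edges + semantic_edges:
    graph[u].append(v)
    graph[v].append(u)

def random_walk(start, length, rng):
    """Uniform random walk over the multi-graph; parallel edges raise
    the transition probability between strongly related words."""
    walk = [start]
    for _ in range(length - 1):
        walk.append(rng.choice(graph[walk[-1]]))
    return walk

rng = random.Random(0)
walks = [random_walk(word, 5, rng) for word in list(graph) for _ in range(10)]
# These walk "sentences" would then be passed to a skip-gram trainer
# to obtain the joint word representations.
```

Note that the multi-graph keeps edges from both sources as distinct parallel edges rather than merging them, which is what lets the distributional signal remain alongside the semantic one.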
