Joint Unsupervised Learning of Semantic Representation of Words and Roles in Dependency Trees

2017-09-01 · RANLP 2017

Michal Konkol

Abstract

In this paper, we introduce WoRel, a model that jointly learns word embeddings and a semantic representation of word relations. The model learns from plain text sentences and their dependency parse trees. The word embeddings produced by WoRel outperform Skip-Gram and GloVe on word similarity and syntactic word analogy tasks, and achieve comparable results on word relatedness and semantic word analogy tasks. We show that the semantic representation of relations enables us to express the meaning of phrases and is a promising research direction for semantics at the sentence level.
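The training signal described in the abstract — words paired with the dependency relations that connect them — can be illustrated with a small sketch. The function name, the triple format, and the inverse-relation convention below are illustrative assumptions, not the authors' implementation; this is a minimal example of turning a dependency-parsed sentence into (word, relation, context-word) triples of the kind a model like WoRel could train on.

```python
# Illustrative sketch only: extract (word, relation, context) triples
# from a dependency parse. The data format is an assumption, not
# taken from the paper.

def extract_triples(tokens, heads, labels):
    """tokens: list of words; heads[i]: index of token i's head
    (-1 for the root); labels[i]: dependency relation of token i."""
    triples = []
    for i, (head, label) in enumerate(zip(heads, labels)):
        if head < 0:
            continue  # skip the root token, which has no head
        # dependent --relation--> head, plus the inverse direction
        triples.append((tokens[i], label, tokens[head]))
        triples.append((tokens[head], label + "^-1", tokens[i]))
    return triples

# Toy parse of "dogs chase cats": "chase" is the root,
# "dogs" its nsubj, "cats" its obj.
tokens = ["dogs", "chase", "cats"]
heads = [1, -1, 1]
labels = ["nsubj", "root", "obj"]

for triple in extract_triples(tokens, heads, labels):
    print(triple)
```

Each triple pairs a target word with a (relation, context-word) context; training an embedding model on such triples instead of plain word co-occurrence windows is what injects syntactic structure into the learned representations.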
