
Investigating Different Syntactic Context Types and Context Representations for Learning Word Embeddings

2017-09-01 · EMNLP 2017

Bofang Li, Tao Liu, Zhe Zhao, Buzhou Tang, Aleks Drozd, Anna Rogers, Xiaoyong Du

Abstract

The number of word embedding models is growing every year. Most of them are based on the co-occurrence information of words and their contexts. However, the best definition of context remains an open question. We provide a systematic investigation of 4 different syntactic context types and context representations for learning word embeddings. Comprehensive experiments are conducted to evaluate their effectiveness on 6 extrinsic and intrinsic tasks. We hope that this paper, along with the published code, will be helpful for choosing the best context type and representation for a given task.
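The contrast between context types and representations studied here can be illustrated with a toy sketch (hypothetical example, not the authors' code): a linear window context versus a dependency-based context, with the dependency contexts shown in "bound" form, i.e. each context word tagged with its syntactic relation and direction. The sentence, arcs, and function names below are illustrative assumptions.

```python
# Toy sentence and a hand-coded dependency parse for it.
sentence = ["australian", "scientist", "discovers", "star"]
# Dependency arcs as (head_index, relation, dependent_index).
arcs = [(1, "amod", 0), (2, "nsubj", 1), (2, "dobj", 3)]

def linear_contexts(tokens, target, window=2):
    """Linear (word2vec-style) contexts: neighbors within a fixed window."""
    return [tokens[i]
            for i in range(max(0, target - window),
                           min(len(tokens), target + window + 1))
            if i != target]

def dependency_contexts(tokens, arcs, target):
    """Dependency-based contexts in 'bound' representation: each context
    word is tagged with the relation linking it to the target, with -1
    marking the inverse direction (the target's own head)."""
    ctx = []
    for head, rel, dep in arcs:
        if head == target:
            ctx.append(f"{tokens[dep]}/{rel}")     # dependent of the target
        elif dep == target:
            ctx.append(f"{tokens[head]}/{rel}-1")  # head of the target
    return ctx

print(linear_contexts(sentence, 1))           # ['australian', 'discovers', 'star']
print(dependency_contexts(sentence, arcs, 1)) # ['australian/amod', 'discovers/nsubj-1']
```

An "unbound" representation would drop the `/rel` tags, so the same context word contributes one shared embedding regardless of the relation through which it co-occurs with the target.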
