
Rotate King to get Queen: Word Relationships as Orthogonal Transformations in Embedding Space

2019-09-02 · IJCNLP 2019

Kawin Ethayarajh


Abstract

A notable property of word embeddings is that word relationships can exist as linear substructures in the embedding space. For example, gender corresponds to woman − man and queen − king. This, in turn, allows word analogies to be solved arithmetically: king − man + woman ≈ queen. This property is notable because it suggests that models trained on word embeddings can easily learn such relationships as geometric translations. However, there is no evidence that models exclusively represent relationships in this manner. We document an alternative way in which downstream models might learn these relationships: orthogonal and linear transformations. For example, given a translation vector for gender, we can find an orthogonal matrix R, representing a rotation and reflection, such that R(king) ≈ queen and R(man) ≈ woman. Analogical reasoning using orthogonal transformations is almost as accurate as using vector arithmetic; using linear transformations is more accurate than both. Our findings suggest that these transformations can be as good a representation of word relationships as translation vectors.
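The orthogonal map described in the abstract can be found with the classic orthogonal Procrustes solution: stack the source vectors (man, king) and target vectors (woman, queen) into matrices and take the SVD of their cross-covariance. The sketch below uses random toy vectors in place of real embeddings (a hypothetical stand-in; the paper would use pretrained vectors such as GloVe), constructing the targets with a known rotation so an exact solution exists.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 4-d stand-ins for the embeddings of "man" and "king"
# (real experiments would use pretrained word vectors).
A = rng.standard_normal((2, 4))  # rows: man, king

# Fabricate "woman" and "queen" by applying a known orthogonal map,
# so an exact solution exists in this toy setup.
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))  # random orthogonal matrix
B = A @ Q  # rows: woman, queen

# Orthogonal Procrustes: the orthogonal R minimizing ||A R - B||_F
# is U @ Vt, where U, Vt come from the SVD of A^T B.
U, _, Vt = np.linalg.svd(A.T @ B)
R = U @ Vt

assert np.allclose(R.T @ R, np.eye(4))  # R is orthogonal
assert np.allclose(A @ R, B)            # R maps man -> woman, king -> queen
```

Because R is constrained to be orthogonal, it preserves lengths and angles, which is what makes it a rotation/reflection rather than an arbitrary linear map; dropping the SVD-based constraint and solving ordinary least squares would give the unconstrained linear transformation the abstract also compares against.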
