Impart Contextualization to Static Word Embeddings through Semantic Relations
2022-01-16 · ACL ARR January 2022
Anonymous
Abstract
Dense word embeddings are a foundational model for downstream NLP research; they encode the meanings of words into low-dimensional vector spaces. Most recent models with state-of-the-art performance adopt contextualized word embeddings, which distinguish the various meanings of a word through its dynamic context. To impart contextual information to static word embeddings, we formulate three semantic relations (interchangeable, opposite, and relative) to find a subset of dimensions that interprets a specific context. Experiments show that these relations can be mined from fastText embeddings.
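The abstract does not spell out how a relation-specific subset of dimensions is identified, but the idea can be sketched roughly: compare per-dimension differences across word pairs that stand in an "interchangeable" (near-synonym) versus an "opposite" (antonym) relation, and keep the dimensions where opposites diverge most. The snippet below is a hypothetical illustration, not the paper's procedure; the toy random vectors stand in for real fastText embeddings, and the word pairs and scoring function are assumptions.

```python
import numpy as np

# Toy stand-ins for fastText vectors (in practice, load real embeddings,
# e.g. with gensim or the fasttext library).
rng = np.random.default_rng(0)
dim = 300
words = ["hot", "cold", "warm", "chilly"]
emb = {w: rng.normal(size=dim) for w in words}

def dimension_scores(pairs, emb):
    """Mean per-dimension absolute difference across the given word pairs."""
    diffs = np.stack([np.abs(emb[a] - emb[b]) for a, b in pairs])
    return diffs.mean(axis=0)

# Interchangeable (near-synonym) vs. opposite (antonym) pairs.
syn_score = dimension_scores([("hot", "warm"), ("cold", "chilly")], emb)
ant_score = dimension_scores([("hot", "cold"), ("warm", "chilly")], emb)

# Dimensions where opposite pairs diverge more than interchangeable pairs
# are candidates for encoding the contrastive (opposite-relation) context.
contrast_dims = np.argsort(ant_score - syn_score)[-10:]
print(contrast_dims)
```

With real embeddings, the selected dimensions could then be inspected or masked to test whether they isolate the context that separates antonyms while leaving near-synonyms close together.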