Retrofitting of Pre-trained Emotion Words with VAD-dimensions and the Plutchik Emotions

2021-12-01, ICON 2021

Manasi Kulkarni, Pushpak Bhattacharyya

Abstract

Word representations are based on the distributional hypothesis, according to which words that occur in similar contexts tend to have similar meanings and appear closer in vector space. As a result, the emotionally dissimilar words "joy" and "sadness" have a high cosine similarity. Existing pre-trained embedding models therefore lack an emotional interpretation of words. To create our VAD-Emotion embeddings, we modify pre-trained word embeddings with emotion information. This is a lexicon-based approach that uses Valence, Arousal and Dominance (VAD) values and Plutchik's emotions to incorporate emotion information into pre-trained word embeddings through post-training processing. This brings emotionally similar words nearer and pushes emotionally dissimilar words away from each other in the proposed vector space. We demonstrate the performance of the proposed embeddings on a downstream NLP task: Emotion Recognition.
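The post-training idea described in the abstract can be illustrated with a retrofitting-style update, where each word vector is pulled toward the vectors of words whose VAD values are close to its own, while staying anchored to its original position. This is only a minimal sketch: the toy embeddings, the VAD triples, the neighbor threshold, and the update weights below are all illustrative assumptions, not the authors' actual procedure, and only the attraction step (emotionally similar words moving nearer) is shown.

```python
import numpy as np

# Toy pre-trained embeddings (hypothetical values, for illustration only).
embeddings = {
    "joy":     np.array([0.9, 0.1, 0.3]),
    "delight": np.array([0.2, 0.8, 0.4]),
    "sadness": np.array([0.8, 0.2, 0.3]),  # distributionally close to "joy"
}

# Toy VAD lexicon: (valence, arousal, dominance) triples in [0, 1].
vad = {
    "joy":     np.array([0.98, 0.82, 0.69]),
    "delight": np.array([0.95, 0.75, 0.65]),
    "sadness": np.array([0.05, 0.22, 0.24]),
}

def emotion_neighbors(word, threshold=0.3):
    """Words whose VAD triple lies within `threshold` of `word`'s (assumed rule)."""
    return [w for w in vad
            if w != word and np.linalg.norm(vad[w] - vad[word]) < threshold]

def retrofit(embeddings, iterations=10, alpha=1.0, beta=1.0):
    """Retrofitting-style post-processing: each vector is iteratively moved
    toward its emotionally similar neighbors, balanced against its original
    pre-trained position by the weights alpha and beta."""
    new = {w: v.copy() for w, v in embeddings.items()}
    for _ in range(iterations):
        for word in new:
            nbrs = emotion_neighbors(word)
            if not nbrs:
                continue  # no emotional evidence: keep the vector as-is
            nbr_sum = sum(new[n] for n in nbrs)
            new[word] = (alpha * embeddings[word] + beta * nbr_sum) / \
                        (alpha + beta * len(nbrs))
    return new

def cos(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

retro = retrofit(embeddings)
# "joy" and "delight" share similar VAD values, so their vectors move closer;
# "sadness" has no emotional neighbors here and stays at its original position.
```

After the update, the cosine similarity between "joy" and "delight" rises relative to the original space, which is the effect the abstract describes for emotionally similar words.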
