
A Comprehensive Analysis of Preprocessing for Word Representation Learning in Affective Tasks

2020-07-01 · ACL 2020

Nastaran Babanejad, Ameeta Agrawal, Aijun An, Manos Papagelis

Abstract

Affective tasks such as sentiment analysis, emotion classification, and sarcasm detection have become popular in recent years due to an abundance of user-generated data, accurate computational linguistic models, and a broad range of relevant applications in various domains. At the same time, many studies have highlighted the importance of text preprocessing as an integral step in any natural language processing prediction model and downstream task. While preprocessing in affective systems is well studied, preprocessing in word vector-based models applied to affective systems is not. To address this limitation, we conduct a comprehensive analysis of the role of preprocessing techniques in affective analysis based on word vector models. Our analysis is the first of its kind and provides useful insights into the importance of each preprocessing technique when applied at the training phase (commonly ignored in pretrained word vector models) and/or at the downstream task phase.
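The abstract contrasts preprocessing applied at the word vector training phase with preprocessing applied at the downstream affective task phase. As a minimal illustration (not the paper's actual pipeline — the function names, step selection, and tiny stopword list are all assumptions), a few commonly compared preprocessing steps might be sketched like this:

```python
import re

# Illustrative sketch only: a few preprocessing techniques of the kind such a
# study might compare, applicable either before training word vectors or
# before the downstream affective task. Not the paper's exact method.

STOPWORDS = {"the", "a", "an", "is", "of", "and"}  # tiny example list, not exhaustive


def lowercase(text: str) -> str:
    return text.lower()


def strip_punctuation(text: str) -> str:
    # Drop everything except word characters and whitespace.
    return re.sub(r"[^\w\s]", "", text)


def remove_stopwords(tokens: list) -> list:
    return [t for t in tokens if t not in STOPWORDS]


def preprocess(text: str, rm_stop: bool = True) -> list:
    """Lowercase, strip punctuation, tokenize on whitespace,
    and optionally remove stopwords."""
    tokens = strip_punctuation(lowercase(text)).split()
    return remove_stopwords(tokens) if rm_stop else tokens


print(preprocess("The movie was NOT great!!"))
# → ['movie', 'was', 'not', 'great']
```

Note that for affective tasks specifically, steps like stopword or punctuation removal are not obviously safe — negators ("not") and emphasis markers ("!!!") can carry sentiment signal — which is precisely the kind of trade-off a study like this would measure.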
