
Enhancing Aspect-level Sentiment Analysis with Word Dependencies

2021-04-01 · EACL 2021

Yuanhe Tian, Guimin Chen, Yan Song


Abstract

Aspect-level sentiment analysis (ASA) has received much attention in recent years. Most existing approaches try to leverage syntactic information, such as the dependency parsing results of the input text, to improve sentiment analysis on different aspects. Although these approaches achieve satisfactory results, they mainly focus on the dependency arcs among words while omitting the dependency type information, and they model all dependencies equally, so noisy parsing results may hurt model performance. In this paper, we propose an approach to enhance aspect-level sentiment analysis with word dependencies, where type information is modeled by key-value memory networks and different dependency results are selectively leveraged. Experimental results on five benchmark datasets demonstrate the effectiveness of our approach, which outperforms baseline models on all datasets and achieves state-of-the-art performance on three of them.
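To illustrate the core idea of selectively weighting dependency (key, value) pairs, here is a minimal sketch of key-value memory attention. This is not the paper's implementation: the function name, dimensions, and the use of plain softmax attention are illustrative assumptions; keys stand in for embeddings of dependency-connected context words and values for embeddings of the corresponding dependency types.

```python
import numpy as np

def kv_memory_attention(query, keys, values):
    """Illustrative key-value memory read over word dependencies.

    query:  (d,)   representation of the aspect word
    keys:   (n, d) embeddings of dependency-connected context words
    values: (n, d) embeddings of the corresponding dependency types

    Returns a (d,) vector: the attention-weighted sum of type embeddings,
    so low-relevance (potentially noisy) dependencies contribute little.
    """
    scores = keys @ query                    # relevance of each dependency to the aspect
    weights = np.exp(scores - scores.max())  # numerically stable softmax
    weights /= weights.sum()
    return weights @ values                  # selectively weighted type information

# toy example: 3 dependencies in a 4-dimensional embedding space
rng = np.random.default_rng(0)
q = rng.standard_normal(4)
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out = kv_memory_attention(q, K, V)
print(out.shape)  # (4,)
```

In practice the resulting vector would be combined with the aspect representation before classification; the softmax weighting is what lets the model down-weight unreliable parser output instead of treating every dependency equally.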
