
A Self-Attentive Hierarchical Model for Jointly Improving Text Summarization and Sentiment Classification

2018-11-14 · Proceedings of The 10th Asian Conference on Machine Learning (ACML 2018)

Hongli Wang, Jiangtao Ren


Abstract

Text summarization and sentiment classification are two core NLP tasks in text analysis, both aimed at extracting the main ideas of a text, but at different levels of abstraction. From this perspective, sentiment classification can be regarded as a more abstractive form of summarization. Following this view, we propose a Self-Attentive Hierarchical model for jointly improving text Summarization and Sentiment Classification (SAHSSC). The model performs abstractive text summarization and sentiment classification jointly within a hierarchical end-to-end neural framework, in which a sentiment classification layer stacked on top of the summarization layer predicts the sentiment label in light of both the text and the generated summary. Furthermore, a self-attention layer is introduced into the hierarchical framework as the bridge between the summarization layer and the sentiment classification layer, capturing emotional information at both the text level and the summary level. The proposed model can generate a more relevant summary and produce a more accurate, summary-aware sentiment prediction. Experimental results on the SNAP Amazon online review datasets show that our model outperforms state-of-the-art baselines on both abstractive text summarization and sentiment classification by a considerable margin.
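The self-attention bridge described in the abstract pools a sequence of hidden states into a single vector via learned attention weights. A minimal NumPy sketch of one common formulation (structured self-attention in the style of Lin et al., 2017) is shown below; the function and parameter names are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attentive_pooling(H, W1, w2):
    """Collapse a sequence of hidden states H (shape T x d) into one vector.

    A per-step score is computed through a small MLP, normalized with
    softmax, and used to form a weighted average of the hidden states.
    Applied once to encoder states (text level) and once to decoder
    states (summary level), two such vectors could feed the sentiment
    classifier, as the hierarchical design in the abstract suggests.
    """
    scores = w2 @ np.tanh(W1 @ H.T)   # (T,) unnormalized attention scores
    alpha = softmax(scores)           # (T,) attention weights, sum to 1
    return alpha @ H                  # (d,) attention-pooled representation

# Toy usage with random hidden states and parameters.
rng = np.random.default_rng(0)
T, d, da = 5, 8, 4                    # sequence length, hidden dim, attention dim
H = rng.standard_normal((T, d))
W1 = rng.standard_normal((da, d))
w2 = rng.standard_normal(da)
v = self_attentive_pooling(H, W1, w2)
```

In a full model the pooled text-level and summary-level vectors would typically be concatenated before the final sentiment softmax; this sketch only illustrates the pooling step itself.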
