
Novel Chapter Abstractive Summarization using Spinal Tree Aware Sub-Sentential Content Selection

2022-01-16 · ACL ARR January 2022

Anonymous


Abstract

Summarizing novel chapters is a difficult task due to the length of the chapter to be summarized and the fact that summary sentences draw content from multiple sentences in the chapter. We present a pipelined extractive-abstractive approach in which the extractive step filters the content passed to the abstractive component. The extremely lengthy input also yields a dataset highly skewed towards negative instances, so we adopt a margin ranking loss for extraction to encourage separation between positive and negative instances. To generate summary sentences that fuse information from different sentences, our extraction component operates at the constituent level; our novel approach to this problem enriches the text with spinal tree information, which provides context to the extraction model. We show an improvement of 3.71 Rouge-1 points over the state-of-the-art on an existing novel chapter dataset.
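The margin ranking loss mentioned above can be illustrated with a minimal sketch. This is not the authors' implementation; the pairwise hinge formulation below is an assumption about how positive (summary-worthy) and negative constituent scores might be separated by a margin:

```python
def margin_ranking_loss(pos_scores, neg_scores, margin=1.0):
    """Hinge-style margin ranking loss over positive/negative score pairs.

    For every (positive, negative) pair, a penalty is incurred when the
    positive constituent's score fails to exceed the negative's score by
    at least `margin`. Averaging over all pairs makes the loss robust to
    the heavy class imbalance described in the abstract, since each
    positive is compared against many negatives rather than drowned out
    by them.
    """
    losses = [
        max(0.0, margin - p + n)
        for p in pos_scores
        for n in neg_scores
    ]
    return sum(losses) / len(losses)


# Well-separated scores incur no loss; scores inside the margin do.
well_separated = margin_ranking_loss([2.5], [0.5])   # 2.5 - 0.5 >= 1.0
inside_margin = margin_ranking_loss([1.0], [0.8])    # gap of only 0.2
```

In a neural setting this would typically be applied to model logits (e.g. via `torch.nn.MarginRankingLoss`), but the scalar version above captures the core idea: the loss is zero only once every positive outscores every negative by the margin.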
