
HEGEL: Hypergraph Transformer for Long Document Summarization

2022-10-09

Haopeng Zhang, Xiao Liu, Jiawei Zhang


Abstract

Extractive summarization of long documents is challenging due to the extended, structured input context. Long-distance sentence dependencies hinder cross-sentence relation modeling, the critical step of extractive summarization. This paper proposes HEGEL, a hypergraph neural network for long document summarization that captures high-order cross-sentence relations. HEGEL updates and learns effective sentence representations with hypergraph transformer layers and fuses different types of sentence dependencies, including latent topics, keyword coreference, and section structure. We validate HEGEL through extensive experiments on two benchmark datasets, and the results demonstrate the effectiveness and efficiency of HEGEL.
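The core idea of the abstract, sentences connected by hyperedges that each group several sentences sharing a latent topic, a keyword, or a section, can be illustrated with a minimal message-passing sketch. This is not the paper's implementation (HEGEL uses attention-based hypergraph transformer layers); it is a simplified mean-aggregation version, and the toy hyperedge groupings below are hypothetical.

```python
import numpy as np

def hypergraph_propagate(X, hyperedges):
    """One sentence -> hyperedge -> sentence aggregation step.

    X          : (n_sentences, d) array of sentence embeddings
    hyperedges : list of sentence-index lists, one per hyperedge
                 (e.g. topic, keyword, and section edges)
    """
    n, _ = X.shape
    # Incidence matrix: H[i, e] = 1 if sentence i belongs to hyperedge e.
    H = np.zeros((n, len(hyperedges)))
    for e, members in enumerate(hyperedges):
        H[members, e] = 1.0
    # Hyperedge features: mean of member sentence embeddings.
    E = (H.T @ X) / np.maximum(H.sum(axis=0), 1)[:, None]
    # Updated sentence features: mean over incident hyperedge features.
    deg = np.maximum(H.sum(axis=1), 1)[:, None]
    return (H @ E) / deg

# Toy document: 4 sentences; hypothetical hyperedges of three types:
# a topic edge {0,1,2}, a keyword edge {1,3}, a section edge {2,3}.
X = np.eye(4)
edges = [[0, 1, 2], [1, 3], [2, 3]]
X_new = hypergraph_propagate(X, edges)
print(X_new.shape)  # (4, 4)
```

Unlike an ordinary graph edge, each hyperedge here links an arbitrary number of sentences at once, which is how high-order (more than pairwise) cross-sentence relations enter the update.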
