
AttentionRank: Unsupervised Keyphrase Extraction using Self and Cross Attentions

2021-11-01 · EMNLP 2021 · Code Available

Haoran Ding, Xiao Luo


Abstract

Keyword or keyphrase extraction aims to identify the words or phrases that present the main topics of a document. This paper proposes AttentionRank, a hybrid attention model that identifies keyphrases in a document in an unsupervised manner. AttentionRank calculates self-attention and cross-attention using a pre-trained language model. The self-attention is designed to determine the importance of a candidate within the context of a sentence. The cross-attention is calculated to identify the semantic relevance between a candidate and the sentences of a document. We evaluate AttentionRank on three publicly available datasets against seven baselines. The results show that AttentionRank is an effective and robust unsupervised keyphrase extraction model on both long and short documents. Source code is available on GitHub.
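The abstract describes two signals that are combined to rank candidates: a self-attention score for a candidate's importance within a sentence, and a cross-attention score for its semantic relevance to the document's sentences. The toy sketch below illustrates one plausible way such a combination could work; the attention matrices, embeddings, the averaging choices, and the weighting parameter `alpha` are all illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def self_attention_importance(attn, cand_idx):
    # attn: (T, T) self-attention matrix for one sentence
    # (rows attend to columns). As a simplifying assumption, the
    # importance of a candidate is the average attention its token
    # positions receive from all tokens in the sentence.
    return float(attn[:, cand_idx].mean())

def cross_attention_relevance(cand_vec, sent_vecs):
    # Cosine similarity between a candidate embedding and each
    # sentence embedding, averaged over the document's sentences
    # (a stand-in for the paper's cross-attention computation).
    cand = cand_vec / np.linalg.norm(cand_vec)
    sents = sent_vecs / np.linalg.norm(sent_vecs, axis=1, keepdims=True)
    return float((sents @ cand).mean())

def rank_candidates(candidates, attn, embeds, sent_vecs, alpha=0.5):
    # Final score: convex combination of the two signals.
    # alpha is a hypothetical weighting parameter, not from the paper.
    scores = {}
    for name, idx in candidates.items():
        s = self_attention_importance(attn, idx)
        c = cross_attention_relevance(embeds[name], sent_vecs)
        scores[name] = alpha * s + (1 - alpha) * c
    return sorted(scores, key=scores.get, reverse=True)

# Tiny hand-built example: "neural" receives most of the sentence's
# attention and aligns with the document's sentence embeddings.
candidates = {"neural": [0], "model": [1]}
attn = np.array([[0.8, 0.1, 0.1],
                 [0.7, 0.2, 0.1],
                 [0.9, 0.05, 0.05]])
embeds = {"neural": np.array([1.0, 0.0]), "model": np.array([0.0, 1.0])}
sent_vecs = np.array([[1.0, 0.0], [1.0, 0.0]])
ranking = rank_candidates(candidates, attn, embeds, sent_vecs)
```

In this constructed example, "neural" outranks "model" on both signals, so it comes first in the ranking; in the real model, the attention matrices and embeddings would come from a pre-trained language model rather than being hand-built.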
