
Transformer-based Models for Long Document Summarisation in Financial Domain

2022-06-01 · FNP (LREC) 2022

Urvashi Khanna, Samira Ghodratnama, Diego Mollá, Amin Beheshti


Abstract

Summarisation of long financial documents is a challenging task due to the lack of large-scale datasets and the need for domain knowledge experts to create human-written summaries. Traditional summarisation approaches that generate a summary based on the content cannot produce summaries comparable to human-written ones and thus are rarely used in practice. In this work, we use the Longformer-Encoder-Decoder (LED) model to handle long financial reports. We describe our experiments and participating systems in the financial narrative summarisation shared task. Multi-stage fine-tuning helps the model generalise better on niche domains and avoids the problem of catastrophic forgetting. We further investigate the effect of the staged fine-tuning approach on the FNS dataset. Our systems achieved promising results in terms of ROUGE scores on the validation dataset.
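The multi-stage fine-tuning the abstract describes (adapting the model on broader summarisation data before the niche FNS data, to improve generalisation and limit catastrophic forgetting) can be sketched in outline. The stage names, learning rates, and the `train_one_stage` callback below are illustrative assumptions, not the authors' exact setup.

```python
# Illustrative sketch of staged fine-tuning: earlier stages use broad
# summarisation corpora, later stages use in-domain data with a smaller
# learning rate to reduce catastrophic forgetting. Corpus names and
# hyperparameters are assumptions, not the paper's reported values.

def staged_finetune(train_one_stage, stages):
    """Run fine-tuning stages in order; return the (corpus, lr) log."""
    history = []
    for corpus, lr in stages:
        train_one_stage(corpus, lr)  # e.g. one LED fine-tuning pass
        history.append((corpus, lr))
    return history

# A dummy trainer standing in for a real LED fine-tuning step.
seen = []
def dummy_trainer(corpus, lr):
    seen.append(corpus)

stages = [
    ("cnn_dailymail", 5e-5),  # broad summarisation corpus first (assumed)
    ("fns_2022", 1e-5),       # niche financial reports last, lower lr
]
history = staged_finetune(dummy_trainer, stages)
```

In practice each stage would be a full fine-tuning run of the LED checkpoint; the key point is only the ordering from general to domain-specific data.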
