
BERT Fine-tuning For Arabic Text Summarization

2020-03-29

Khalid N. Elmadani, Mukhtar Elgezouli, Anas Showk


Abstract

Fine-tuning a pretrained BERT model is the state-of-the-art method for extractive and abstractive text summarization. In this paper, we show how this fine-tuning method can be applied to Arabic, both to construct the first documented model for abstractive Arabic text summarization and to evaluate its performance on extractive Arabic summarization. Our model is built on multilingual BERT, since Arabic does not have a pretrained BERT of its own. We first evaluate its performance on an English corpus before applying it to Arabic corpora on both the extractive and abstractive tasks.
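To illustrate the shape of the extractive pipeline the abstract describes (encode each sentence, score it, select the top-scoring subset), here is a minimal stdlib-only sketch. The hash-based sentence encoder and centroid scoring below are stand-ins of our own, not the paper's method: in the paper, the encoder is multilingual BERT fine-tuned end-to-end, and sentence scores come from a learned classification head.

```python
import math
import re

def sentence_vector(sentence, dim=32):
    """Stand-in sentence encoder: normalized bag-of-words hashing.
    In the paper's setup this role is played by multilingual BERT
    (fine-tuned end-to-end); this toy encoder only illustrates the
    pipeline, not the model."""
    vec = [0.0] * dim
    for tok in re.findall(r"\w+", sentence.lower()):
        vec[hash(tok) % dim] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def extractive_summary(document, k=2):
    """Generic extractive recipe: encode sentences, score each against
    the document centroid (a stand-in for a learned scorer), and keep
    the top-k sentences in their original order."""
    sentences = [s.strip()
                 for s in re.split(r"(?<=[.!?])\s+", document) if s.strip()]
    vecs = [sentence_vector(s) for s in sentences]
    centroid = [sum(col) / len(vecs) for col in zip(*vecs)]
    scores = [sum(a * b for a, b in zip(v, centroid)) for v in vecs]
    top = sorted(sorted(range(len(sentences)), key=lambda i: -scores[i])[:k])
    return " ".join(sentences[i] for i in top)
```

The same selection loop applies unchanged to Arabic text, which is the point of using a multilingual encoder: only the sentence representations change, not the extractive machinery around them.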
