
Quranic Verses Semantic Relatedness Using AraBERT

2021-04-01 · EACL (WANLP) 2021

Abdullah Alsaleh, Eric Atwell, Abdulrahman Altahhan


Abstract

Bidirectional Encoder Representations from Transformers (BERT) has gained popularity in recent years, producing state-of-the-art performance across Natural Language Processing tasks. In this paper, we used the AraBERT language model to classify pairs of verses from the QurSim dataset as either semantically related or not. We pre-processed the QurSim dataset and formed three datasets for comparison. We also used both versions of AraBERT, AraBERTv02 and AraBERTv2, to determine which version performs best on the given datasets. The best result was achieved by AraBERTv02, with a 92% accuracy score on a dataset comprising label '2' and label '-1' pairs, the latter generated from outside the QurSim dataset.
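The dataset construction described above — positive pairs (label '2') taken from QurSim, and negative pairs (label '-1') generated from outside it — can be sketched as follows. This is a minimal illustration, not the authors' exact pre-processing: the function name, the one-negative-per-positive balance, and the random rejection sampling are all assumptions.

```python
import random

def build_pair_dataset(related_pairs, all_verses, seed=0):
    """Form a balanced verse-pair dataset: label 2 for related pairs
    (from QurSim) and label -1 for unrelated pairs sampled from the
    verse pool, rejecting any pair already known to be related.
    Illustrative sketch only; names and sampling scheme are assumptions."""
    rng = random.Random(seed)
    related = {frozenset(p) for p in related_pairs}
    dataset = [(a, b, 2) for a, b in related_pairs]
    # Sample one negative pair per positive pair, skipping known related pairs.
    negatives = 0
    while negatives < len(related_pairs):
        a, b = rng.sample(all_verses, 2)
        if frozenset((a, b)) not in related:
            dataset.append((a, b, -1))
            negatives += 1
    rng.shuffle(dataset)
    return dataset
```

Each resulting (verse_a, verse_b, label) triple can then be fed to a BERT-style model as a standard sentence-pair classification input.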
