
GYM at Qur’an QA 2023 Shared Task: Multi-Task Transfer Learning for Quranic Passage Retrieval and Question Answering with Large Language Models

2023-12-07 · ArabicNLP at EMNLP · Code Available

Ghazaleh Mahmoudi, Yeganeh Morshedzadeh, Sauleh Eetemadi


Abstract

This work addresses the challenges of question answering for ancient texts such as the Quran. It covers two tasks: passage retrieval and reading comprehension. For passage retrieval, it employs unsupervised fine-tuning of sentence encoders and supervised multi-task learning. For reading comprehension, it fine-tunes an Electra-based model, demonstrating significant improvements over baseline models. Our best AraElectra model achieves 46.1% partial Average Precision (pAP) on the unseen test set, outperforming the baseline by 23%.
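To make the passage-retrieval setup concrete, here is a minimal, self-contained sketch of encoder-based retrieval: passages are ranked by the cosine similarity between the question embedding and each passage embedding. The `embed` function below is a toy bag-of-words stand-in for the fine-tuned sentence encoders the paper actually uses (which produce dense vectors); the example passages are illustrative paraphrases, not data from the shared task.

```python
import math
from collections import Counter

def embed(text):
    """Toy 'encoder': term-frequency vector for a text.
    A real system would use a fine-tuned sentence encoder instead."""
    return Counter(text.lower().split())

def cosine(u, v):
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(u[t] * v[t] for t in u if t in v)
    nu = math.sqrt(sum(c * c for c in u.values()))
    nv = math.sqrt(sum(c * c for c in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def retrieve(question, passages, top_k=2):
    """Rank passages by embedding similarity to the question."""
    q = embed(question)
    ranked = sorted(passages, key=lambda p: cosine(q, embed(p)), reverse=True)
    return ranked[:top_k]

passages = [
    "patience and prayer are a means of seeking help",
    "the creation of the heavens and the earth",
    "charity given in secret and in public",
]
print(retrieve("how does one seek help through patience", passages, top_k=1))
```

Swapping `embed` for a trained dense encoder (and a vectorized dot product over precomputed passage embeddings) turns this sketch into a standard dense-retrieval pipeline.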
