Exploring Pre-Trained Transformers and Bilingual Transfer Learning for Arabic Coreference Resolution

2021-11-01 · CRAC (ACL) 2021 · Code Available

Bonan Min

Abstract

In this paper, we develop bilingual transfer learning approaches to improve Arabic coreference resolution by leveraging additional English annotation via bilingual or multilingual pre-trained transformers. We show that bilingual transfer learning improves strong transformer-based neural coreference models by 2-4 F1. We also systematically investigate the effectiveness of several pre-trained transformer models that differ in training corpora, languages covered, and model capacity. Our best model achieves a new state-of-the-art performance of 64.55 F1 on the Arabic OntoNotes dataset. Our code is publicly available at https://github.com/bnmin/arabic_coref.
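
The core recipe the abstract describes is two-stage fine-tuning of a shared multilingual encoder: train a coreference model on English OntoNotes first, then continue training on Arabic OntoNotes. The sketch below illustrates that idea only; the `CorefScorer` model, its first-token span representation, the `train_stage` helper, the choice of `bert-base-multilingual-cased`, and all hyperparameters are illustrative assumptions, not the authors' implementation (see the repository above for the actual code).

```python
# Minimal sketch of bilingual transfer for coreference, assuming a shared
# multilingual encoder fine-tuned first on English, then on Arabic data.
import torch
from torch import nn
from transformers import AutoModel


class CorefScorer(nn.Module):
    """Toy span-pair coreference scorer on top of a multilingual encoder."""

    def __init__(self, encoder_name="bert-base-multilingual-cased"):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)
        hidden = self.encoder.config.hidden_size
        # Scores whether two mention representations corefer.
        self.pair_scorer = nn.Sequential(
            nn.Linear(2 * hidden, hidden), nn.ReLU(), nn.Linear(hidden, 1)
        )

    def forward(self, input_ids, attention_mask, span_a, span_b):
        states = self.encoder(
            input_ids=input_ids, attention_mask=attention_mask
        ).last_hidden_state
        # Represent each mention by its first subword state (a simplification;
        # real systems pool over the full span).
        batch = torch.arange(states.size(0))
        rep_a, rep_b = states[batch, span_a], states[batch, span_b]
        return self.pair_scorer(torch.cat([rep_a, rep_b], dim=-1)).squeeze(-1)


def train_stage(model, batches, epochs, lr=2e-5):
    """One fine-tuning stage; run first on English, then on Arabic."""
    opt = torch.optim.AdamW(model.parameters(), lr=lr)
    loss_fn = nn.BCEWithLogitsLoss()
    for _ in range(epochs):
        for input_ids, mask, span_a, span_b, label in batches:
            opt.zero_grad()
            loss_fn(model(input_ids, mask, span_a, span_b), label).backward()
            opt.step()


# Usage (english_batches / arabic_batches are assumed iterables of tensors):
# model = CorefScorer()
# train_stage(model, english_batches, epochs=2)   # transfer from English
# train_stage(model, arabic_batches, epochs=10)   # target-language training
```

Because the encoder's subword vocabulary and parameters are shared across languages, the English stage initializes both the encoder and the scoring head with coreference-relevant signal before the Arabic stage begins, which is the mechanism behind the reported 2-4 F1 gain.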
