SOTAVerified

Neural Arabic Text Diacritization: State of the Art Results and a Novel Approach for Machine Translation

2019-11-08 · WS 2019 · Code Available

Ali Fadel, Ibraheem Tuffaha, Bara' Al-Jawarneh, Mahmoud Al-Ayyoub


Abstract

In this work, we present several deep learning models for the automatic diacritization of Arabic text. Our models are built using two main approaches, viz. Feed-Forward Neural Network (FFNN) and Recurrent Neural Network (RNN), with several enhancements such as 100-hot encoding, embeddings, Conditional Random Field (CRF) and Block-Normalized Gradient (BNG). The models are tested on the only freely available benchmark dataset, and the results show that they are on par with or better than competing models, which, unlike ours, require language-dependent post-processing steps. Moreover, we show that diacritics in Arabic can be used to enhance models for other NLP tasks, such as Machine Translation (MT), by proposing the Translation over Diacritization (ToD) approach.
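To make the task concrete, here is a minimal sketch (not the authors' code) of how Arabic diacritization is typically framed as per-character sequence labeling: a model predicts one diacritic class per character, and the diacritized text is produced by interleaving the characters with the predicted marks. The label set and the toy labels below are illustrative assumptions.

```python
# Hypothetical label set: a few common Arabic diacritics plus "no mark".
DIACRITICS = {
    "none": "",
    "fatha": "\u064E",   # short /a/
    "damma": "\u064F",   # short /u/
    "kasra": "\u0650",   # short /i/
    "sukun": "\u0652",   # absence of a vowel
}

def apply_diacritics(text, labels):
    """Interleave each character with its predicted diacritic mark."""
    assert len(text) == len(labels)
    return "".join(ch + DIACRITICS[lab] for ch, lab in zip(text, labels))

# Toy example on the word "كتب" (k-t-b) with made-up labels.
word = "كتب"
labels = ["fatha", "fatha", "fatha"]
print(apply_diacritics(word, labels))  # كَتَبَ ("kataba", "he wrote")
```

In this framing, the FFNN and RNN models described in the paper differ only in how they predict the label sequence; the decoding step above stays the same.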

Tasks

Benchmark Results

Dataset     Model      Metric                 Claimed   Verified   Status
Tashkeela   Shakkelha  Diacritic Error Rate   0.02      —          Unverified
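The metric in the table, Diacritic Error Rate (DER), is commonly defined as the fraction of characters whose predicted diacritic differs from the reference. A minimal sketch under that assumed definition (exact evaluation details, such as whether case endings or undiacritized positions are counted, vary between papers):

```python
def diacritic_error_rate(reference, predicted):
    """Per-character DER over two aligned diacritic-label sequences."""
    assert len(reference) == len(predicted)
    errors = sum(r != p for r, p in zip(reference, predicted))
    return errors / len(reference)

# Toy example: one wrong label out of four characters.
ref  = ["fatha", "damma", "none", "kasra"]
pred = ["fatha", "kasra", "none", "kasra"]
print(diacritic_error_rate(ref, pred))  # 0.25
```

Under this definition, the claimed 0.02 would correspond to roughly one diacritization error per fifty characters.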

Reproductions