NLPRL System for Very Low Resource Supervised Machine Translation

2020-11-01 · WMT (EMNLP) 2020

Rupjyoti Baruah, Rajesh Kumar Mundotiya, Amit Kumar, Anil Kumar Singh

Abstract

This paper describes the results of the system that we used for the WMT20 very low resource (VLR) supervised MT shared task. For our experiments, we use a byte-level version of BPE, which requires a base vocabulary of only 256 symbols. BPE-based models are a kind of sub-word model. Such models try to address the Out-of-Vocabulary (OOV) word problem by performing word segmentation so that segments correspond to morphological units. They are also reported to work across different languages, especially similar languages, due to their sub-word nature. Based on cased BLEU score, our NLPRL systems ranked ninth in the HSB-to-GER and tenth in the GER-to-HSB translation scenario.
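The byte-level BPE idea mentioned in the abstract can be sketched as follows. This is a minimal illustration of the general technique, not the authors' implementation; all function names here are hypothetical. The base vocabulary is the 256 possible byte values, so no character ever falls out of vocabulary, and training repeatedly merges the most frequent adjacent token pair:

```python
from collections import Counter

def byte_level_tokens(text):
    # Base vocabulary: the 256 possible byte values (via UTF-8 encoding),
    # so every input is representable and there are no OOV characters.
    return [bytes([b]) for b in text.encode("utf-8")]

def most_frequent_pair(tokens):
    # Count adjacent token pairs; ties resolve by first occurrence.
    pairs = Counter(zip(tokens, tokens[1:]))
    return pairs.most_common(1)[0][0] if pairs else None

def merge_pair(tokens, pair):
    # Replace every occurrence of the chosen adjacent pair with its merge.
    merged, i = [], 0
    while i < len(tokens):
        if i + 1 < len(tokens) and (tokens[i], tokens[i + 1]) == pair:
            merged.append(tokens[i] + tokens[i + 1])
            i += 2
        else:
            merged.append(tokens[i])
            i += 1
    return merged

def learn_bpe(text, num_merges):
    # Greedy BPE training: iteratively merge the most frequent pair.
    tokens = byte_level_tokens(text)
    merges = []
    for _ in range(num_merges):
        pair = most_frequent_pair(tokens)
        if pair is None:
            break
        merges.append(pair)
        tokens = merge_pair(tokens, pair)
    return tokens, merges
```

On a toy corpus such as `"low low lower"`, two merge steps first join `b"l"` with `b"o"` and then `b"lo"` with `b"w"`, so frequent substrings grow into single sub-word units while rare suffixes like `er` stay split into bytes.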
