Zero-Resource Cross-Lingual Named Entity Recognition

2019-11-22

M Saiful Bari, Shafiq Joty, Prathyusha Jwalapuram

Abstract

Neural methods have recently achieved state-of-the-art (SOTA) results in Named Entity Recognition (NER) for many languages without the need for manually crafted features. However, these models still require manually annotated training data, which is unavailable for many languages. In this paper, we propose a cross-lingual NER model that transfers NER knowledge from one language to another in a fully unsupervised way, without relying on any bilingual dictionary or parallel data. Our model achieves this through word-level adversarial learning and augmented fine-tuning with parameter sharing and feature augmentation. Experiments on five languages demonstrate the effectiveness of our approach, which outperforms existing models by a significant margin and sets a new SOTA for each language pair.
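
The abstract does not spell out how the word-level adversarial learning works. A common instantiation of this idea in zero-resource cross-lingual transfer is MUSE-style embedding alignment, where a linear mapping projects source-language word embeddings into the target space and is trained to fool a discriminator that tries to tell mapped source embeddings from target embeddings. The sketch below illustrates that pattern only; the names (`Discriminator`, `adversarial_step`), dimensions, and optimizer settings are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn as nn

EMB_DIM = 300  # assumed embedding size; the paper does not specify it here


class Discriminator(nn.Module):
    """Predicts whether an embedding is a mapped source vector or a target vector."""

    def __init__(self, dim=EMB_DIM, hidden=512):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden),
            nn.LeakyReLU(0.2),
            nn.Linear(hidden, 1),
        )

    def forward(self, x):
        return self.net(x).squeeze(-1)  # raw logits, shape (batch,)


# Linear mapping W that projects source embeddings into the target space.
mapping = nn.Linear(EMB_DIM, EMB_DIM, bias=False)
disc = Discriminator()

d_opt = torch.optim.SGD(disc.parameters(), lr=0.1)
m_opt = torch.optim.SGD(mapping.parameters(), lr=0.1)
bce = nn.BCEWithLogitsLoss()


def adversarial_step(src_batch, tgt_batch):
    """One word-level adversarial step: update the discriminator, then the mapping."""
    # 1) Discriminator update: learn to separate mapped-source (label 0)
    #    from target (label 1). The mapping is frozen via no_grad here.
    with torch.no_grad():
        mapped = mapping(src_batch)
    logits = torch.cat([disc(mapped), disc(tgt_batch)])
    labels = torch.cat(
        [torch.zeros(len(src_batch)), torch.ones(len(tgt_batch))]
    )
    d_loss = bce(logits, labels)
    d_opt.zero_grad()
    d_loss.backward()
    d_opt.step()

    # 2) Mapping update: flip the labels so the mapping is rewarded when the
    #    discriminator mistakes mapped-source embeddings for target ones.
    m_loss = bce(disc(mapping(src_batch)), torch.ones(len(src_batch)))
    m_opt.zero_grad()
    m_loss.backward()
    m_opt.step()
    return d_loss.item(), m_loss.item()


# Toy usage: random vectors stand in for pretrained monolingual embeddings.
src = torch.randn(64, EMB_DIM)
tgt = torch.randn(64, EMB_DIM)
print(adversarial_step(src, tgt))
```

In practice the two players are alternated for many iterations over batches of frequent words, so that the learned mapping aligns the two embedding spaces without any bilingual supervision; the NER model can then consume the aligned embeddings.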
