Multilinguals at SemEval-2022 Task 11: Transformer Based Architecture for Complex NER

2022-04-05 · SemEval (NAACL) 2022 · Code Available

Amit Pandey, Swayatta Daw, Vikram Pudi

Abstract

We investigate the task of complex NER for the English language. The task is non-trivial due to the semantic ambiguity of the textual structure and the rarity of such entities in the prevalent literature. Using pre-trained language models such as BERT, we obtain competitive performance on this task. We qualitatively analyze the performance of multiple architectures for this task. All our models outperform the baseline by a significant margin, and our best-performing model beats the baseline F1-score by over 9%.
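The F1-score reported above is the standard entity-level metric for NER: a predicted entity counts as correct only when both its boundaries and its type exactly match the gold annotation. The sketch below (illustrative only, not the authors' code; the function names and BIO tagging convention are assumptions) shows how such spans are extracted from BIO tags and scored.

```python
# Illustrative sketch: entity-level (micro) F1 over BIO-tagged sequences,
# the usual scoring scheme for NER shared tasks such as SemEval-2022 Task 11.

def extract_spans(tags):
    """Collect (start, end, type) entity spans from one BIO tag sequence."""
    spans, start, etype = [], None, None
    for i, tag in enumerate(tags):
        if tag.startswith("B-"):
            if start is not None:          # close any open span first
                spans.append((start, i, etype))
            start, etype = i, tag[2:]
        elif tag.startswith("I-") and start is not None and tag[2:] == etype:
            continue                        # span keeps growing
        else:                               # "O" or an inconsistent "I-" tag
            if start is not None:
                spans.append((start, i, etype))
            start, etype = None, None
    if start is not None:                   # span running to end of sentence
        spans.append((start, len(tags), etype))
    return spans

def entity_f1(gold_seqs, pred_seqs):
    """Micro-averaged F1: spans match only on exact boundaries and type."""
    gold, pred = set(), set()
    for idx, (g, p) in enumerate(zip(gold_seqs, pred_seqs)):
        gold |= {(idx,) + s for s in extract_spans(g)}
        pred |= {(idx,) + s for s in extract_spans(p)}
    tp = len(gold & pred)
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    denom = precision + recall
    return 2 * precision * recall / denom if denom else 0.0
```

Note that under this exact-match scheme a prediction with the right type but a truncated boundary scores zero credit, which is part of what makes complex, multi-word entities hard to score well on.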
