
VEE-BERT: Accelerating BERT Inference for Named Entity Recognition via Vote Early Exiting

2022-01-16 · ACL ARR January 2022

Anonymous


Abstract

Named entity recognition (NER) is of great importance for a wide range of tasks, such as medical health record understanding, document analysis, and dialogue understanding. BERT and its variants are the best-performing models for NER. However, these models are notoriously large and slow at inference time, which limits their use in industry. Pilot experiments show that on NER tasks, BERT suffers from a severe over-thinking problem: intermediate layers often already produce correct predictions, which motivates letting BERT exit early at those layers. In this work, we therefore propose a novel method, Vote Early Exiting BERT (VEE-BERT), for improving the early exiting of BERT on NER tasks. To handle complex NER tasks with nested entities, we adopt the Biaffine NER model (Yu et al., 2020), which converts the sequence labeling task into a table-filling task. VEE-BERT makes early exiting decisions by comparing the predictions of the current layer with those of the previous layers. Experiments on six benchmark NER tasks demonstrate that our method effectively accelerates the BERT Biaffine model's inference with less performance loss than the baseline early exiting method.
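The exit criterion described above (stop once the current layer's prediction agrees with those of preceding layers) can be sketched as follows. This is a minimal illustration only: it assumes one internal classifier per transformer layer emitting a label per layer, and the `patience` threshold and function names are hypothetical, since the abstract does not specify the paper's exact voting rule.

```python
# Sketch of a vote/agreement-based early-exit rule. All names are
# illustrative; the paper's precise criterion is not given in the abstract.

def vote_early_exit(layer_predictions, patience=2):
    """Return (prediction, exit_layer_index).

    Walk through per-layer predictions and exit as soon as `patience`
    consecutive layers agree with the layer before them; otherwise
    fall through to the final layer's prediction.
    """
    streak = 0      # how many consecutive layers agreed with their predecessor
    prev = None     # previous layer's prediction
    for i, pred in enumerate(layer_predictions):
        if pred == prev:
            streak += 1
        else:
            streak = 0
        prev = pred
        if streak >= patience:
            return pred, i  # recent layers voted consistently: exit early
    return prev, len(layer_predictions) - 1  # no early exit; use last layer

# Example: early layers disagree, then converge on "PER" by layer 3,
# so inference skips the remaining layers.
preds = ["O", "PER", "PER", "PER", "LOC", "PER"]
label, exit_layer = vote_early_exit(preds, patience=2)  # → ("PER", 3)
```

In a real model, each per-layer prediction would come from a classifier head over that layer's hidden states (for Biaffine NER, a table of span scores rather than a single label), and agreement would be checked per token or per span.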
