Chinese Word Attention based on Valid Division of Sentence
2021-11-16 · ACL ARR November 2021
Anonymous
Abstract
Chinese word attention (CWA) incorporating word-level information is important for natural language processing; the goal is to attend to the words within a sentence. We first obtain valid divisions of a sentence using word-segmentation tools. We use BERT for character- and word-level pre-training. Each character embedding, together with its word in a given division, is encoded with block-local attention. We use attention with a prior to assign attention weights to each segmentation result, and finally combine a global attention mechanism to obtain the optimal recognition result on Chinese NER.
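The block-local attention step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes character embeddings are already computed (e.g. by BERT), represents one segmentation candidate as a list of word lengths, and restricts each character to attend only to characters inside its own word. The function names and the mask construction are hypothetical.

```python
import numpy as np

def block_local_mask(word_lengths):
    """Block-diagonal mask: each character may attend only to
    characters in the same word of one segmentation (division).
    word_lengths: e.g. [2, 3] for a 5-character sentence split
    into a 2-character word followed by a 3-character word."""
    n = sum(word_lengths)
    mask = np.zeros((n, n), dtype=bool)
    start = 0
    for length in word_lengths:
        mask[start:start + length, start:start + length] = True
        start += length
    return mask

def block_local_attention(x, word_lengths):
    """Scaled dot-product self-attention restricted to word blocks.
    x: (n, d) array of character embeddings for one sentence."""
    n, d = x.shape
    scores = x @ x.T / np.sqrt(d)
    # Positions outside the character's word are masked out.
    scores = np.where(block_local_mask(word_lengths), scores, -np.inf)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ x

# Example: 5 characters, one division into words of lengths [2, 3].
x = np.random.default_rng(0).normal(size=(5, 8))
out = block_local_attention(x, [2, 3])
```

Because the mask is block-diagonal, the output for each character mixes only embeddings from its own word; running this once per valid division yields one encoding per segmentation candidate, which the prior-weighted attention can then combine.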