Clause Attention based on Signal Words Division
2021-11-16 · ACL ARR November 2021
Anonymous
Abstract
Clause Attention (CA) is important for processing long sentences. We build and label datasets for signal-word training. According to the positions of the signal words, a long sentence is divided into clauses, each of which is assigned additional block attention. The original sentence is mapped and fed into a shared encoder to learn representations of words within their clauses. We use attention with a prior to balance global attention against local attention, which improves the quality of long-sentence processing in NER and NMT tasks.
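The mechanism described above can be illustrated with a minimal sketch. The function names, the convention that a signal word starts a new clause, and the interpolation weight `lam` are all assumptions for illustration, not the paper's actual implementation: a block-diagonal mask restricts attention to each clause, and a weighted sum blends the global attention distribution with the clause-local one.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def clause_block_mask(seq_len, signal_positions):
    """Block-diagonal mask: each token attends only within its clause.
    Assumption: each signal word begins a new clause."""
    boundaries = [0] + sorted(signal_positions) + [seq_len]
    mask = np.zeros((seq_len, seq_len), dtype=bool)
    for s, e in zip(boundaries[:-1], boundaries[1:]):
        mask[s:e, s:e] = True
    return mask

def attention_with_prior(q, k, v, mask, lam=0.5):
    """Blend global attention with a clause-local prior.
    lam weights the local (masked) distribution; hypothetical scheme."""
    scores = q @ k.T / np.sqrt(q.shape[-1])
    global_attn = softmax(scores)                      # attend everywhere
    local_scores = np.where(mask, scores, -1e9)        # mask out other clauses
    local_attn = softmax(local_scores)                 # attend within clause
    attn = (1 - lam) * global_attn + lam * local_attn  # balance the two
    return attn @ v
```

For example, a 6-token sentence with a signal word at position 3 yields two clause blocks (tokens 0-2 and 3-5); with `lam=0` the output reduces to ordinary global attention, and with `lam=1` each token attends only within its clause.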