Recursive Tree Attention: Improving Semantic Representations with Syntactic Tree Structured Attention Mechanism
Anonymous
Abstract
The attention mechanism has proven effective in state-of-the-art methods across various natural language processing (NLP) tasks. However, these methods still apply attention over plain, linear topological structures, whereas the syntactic structure of natural language is known to be hierarchical. Modeling over syntactic structures such as syntactic trees has been shown to be superior for semantic representation learning. In this paper, to improve semantic representation learning by modeling the attention mechanism over syntactic structures, we propose recursive tree attention, a syntactic-tree-structured attention mechanism that recursively calculates attention weights and summarizes information through constituency parse trees. The information of the children is recursively summarized into their parent through the constituency parse tree, and the summarized information at the root node is used as the semantic representation of the sentence. Experimental results on text classification tasks demonstrate the effectiveness of the proposed attention mechanism: approaches with recursive tree attention outperform the conventional attention mechanism and other syntactic-tree-based approaches.
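The recursion described above (children attended over and summarized into their parent, with the root summary serving as the sentence representation) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the node class `TreeNode`, the function `tree_attention`, and the shared query vector `q` used for scoring are all assumptions made for this example.

```python
import numpy as np

class TreeNode:
    """A constituency parse tree node: leaves carry a word vector,
    internal nodes carry a list of children. (Illustrative structure,
    not the paper's actual data format.)"""
    def __init__(self, word_vec=None, children=None):
        self.word_vec = word_vec
        self.children = children or []

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

def tree_attention(node, q):
    """Recursively summarize a node: a leaf returns its word vector;
    an internal node attends over its children's summaries with a
    (hypothetical) shared query vector q and returns the weighted sum."""
    if not node.children:
        return node.word_vec
    child_reps = np.stack([tree_attention(c, q) for c in node.children])
    scores = child_reps @ q        # one attention score per child
    weights = softmax(scores)      # normalized attention weights
    return weights @ child_reps    # weighted sum -> parent summary

# Toy example over the bracketing ((w1 w2) w3):
d = 4
rng = np.random.default_rng(0)
q = rng.normal(size=d)
leaves = [TreeNode(word_vec=rng.normal(size=d)) for _ in range(3)]
root = TreeNode(children=[TreeNode(children=leaves[:2]), leaves[2]])
sentence_rep = tree_attention(root, q)  # root summary = sentence representation
print(sentence_rep.shape)               # (4,)
```

The root's summary vector would then feed a classifier head for the text classification experiments reported in the paper.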