
Improving Tree-LSTM with Tree Attention

2019-01-01

Mahtab Ahmed, Muhammad Rifayat Samee, Robert E. Mercer


Abstract

In Natural Language Processing (NLP), we often need to extract information from tree-structured representations: a sentence's structure can be captured by a dependency tree or a constituency tree. For this reason, a variant of the LSTM, named Tree-LSTM, was proposed to operate over tree topologies. In this paper, we design a generalized attention framework for both dependency and constituency trees by encoding variants of decomposable attention inside the Tree-LSTM cell. We evaluated our models on a semantic relatedness task and achieved notable improvements over attention-free Tree-LSTM methods as well as other neural and non-neural methods, and competitive results compared to Tree-LSTM based methods with attention.
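To make the idea concrete, below is a minimal pure-Python sketch of a Child-Sum Tree-LSTM cell in which the plain sum over child hidden states is replaced by an attention-weighted sum. This is an illustrative assumption only: the weight shapes, the single-vector scorer `wa`, and the class name are all hypothetical, and the paper itself encodes variants of decomposable attention rather than this simple dot-product scorer.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def matvec(W, v):
    # W is a list of rows; returns W @ v as a plain list
    return [sum(w * x for w, x in zip(row, v)) for row in W]

class AttentiveChildSumTreeLSTM:
    """Child-Sum Tree-LSTM cell where the sum over child hidden states
    is replaced by an attention-weighted sum (hypothetical sketch)."""

    def __init__(self, in_dim, mem_dim, seed=0):
        rng = random.Random(seed)
        def mat(r, c):
            return [[rng.uniform(-0.1, 0.1) for _ in range(c)] for _ in range(r)]
        self.mem_dim = mem_dim
        # input (i), output (o), and update (u) gate parameters
        self.W = {g: mat(mem_dim, in_dim) for g in "iou"}
        self.U = {g: mat(mem_dim, mem_dim) for g in "iou"}
        # forget-gate parameters, applied once per child
        self.Wf, self.Uf = mat(mem_dim, in_dim), mat(mem_dim, mem_dim)
        # attention scoring vector (an assumption; the paper uses
        # decomposable-attention variants instead)
        self.wa = [rng.uniform(-0.1, 0.1) for _ in range(mem_dim)]

    def attend(self, child_hs):
        # score each child hidden state, softmax-normalize, weighted sum
        scores = [sum(a * h for a, h in zip(self.wa, hk)) for hk in child_hs]
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        z = sum(exps)
        alphas = [e / z for e in exps]
        h_att = [sum(a * hk[j] for a, hk in zip(alphas, child_hs))
                 for j in range(self.mem_dim)]
        return h_att, alphas

    def forward(self, x, child_hs, child_cs):
        # attention-weighted aggregate of children (zeros at a leaf)
        if child_hs:
            h_tilde, _ = self.attend(child_hs)
        else:
            h_tilde = [0.0] * self.mem_dim
        i = [sigmoid(a + b) for a, b in
             zip(matvec(self.W["i"], x), matvec(self.U["i"], h_tilde))]
        o = [sigmoid(a + b) for a, b in
             zip(matvec(self.W["o"], x), matvec(self.U["o"], h_tilde))]
        u = [math.tanh(a + b) for a, b in
             zip(matvec(self.W["u"], x), matvec(self.U["u"], h_tilde))]
        c = [iv * uv for iv, uv in zip(i, u)]
        # one forget gate per child, gating that child's cell state
        for hk, ck in zip(child_hs, child_cs):
            f = [sigmoid(a + b) for a, b in
                 zip(matvec(self.Wf, x), matvec(self.Uf, hk))]
            c = [cv + fv * ckv for cv, fv, ckv in zip(c, f, ck)]
        h = [ov * math.tanh(cv) for ov, cv in zip(o, c)]
        return h, c
```

Usage mirrors bottom-up tree evaluation: compute `(h, c)` for each leaf with empty child lists, then feed the children's states into the parent's `forward` call.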
