
A memory enhanced LSTM for modeling complex temporal dependencies

2019-10-25

Sneha Aenugu


Abstract

In this paper, we present Gamma-LSTM, an enhanced long short-term memory (LSTM) unit that enables learning of hierarchical representations through multiple stages of temporal abstraction. A gamma memory, a hierarchical memory unit, forms the core of Gamma-LSTM, with gates regulating the flow of information into the various levels of the hierarchy; this gives the unit control over which level of the hierarchy processes the input at a given instant of time. We demonstrate better performance of the Gamma-LSTM model over regular and stacked LSTMs in two settings (pixel-by-pixel MNIST digit classification and natural language inference), placing emphasis on the ability to generalize over long sequences.
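The abstract does not give the Gamma-LSTM update equations, but the gamma memory it builds on is classically a cascade of leaky integrators (de Vries and Príncipe), where each stage holds a progressively smoother, longer-horizon summary of the input and a gate controls how fast information flows down the cascade. The sketch below illustrates only that classical cascade, with a hypothetical gate value `mu`; it is an assumption about the memory structure, not the paper's actual Gamma-LSTM cell.

```python
# Hypothetical sketch of a gamma memory: a cascade of leaky integrators.
# Each stage k holds a smoother, longer-horizon summary of the stage above;
# the gate mu in (0, 1] controls how fast information flows down the cascade
# (small mu -> slower flow, deeper temporal abstraction). The exact
# Gamma-LSTM update is not given in this excerpt.

def gamma_memory_step(stages, x, mu):
    """Advance a K-stage gamma memory by one time step.

    stages: list of K floats, the current memory taps m_1..m_K
    x:      scalar input at this time step (acts as tap m_0)
    mu:     gate value in (0, 1]
    """
    prev = x  # m_0 is the raw input
    new_stages = []
    for m in stages:
        m = (1.0 - mu) * m + mu * prev  # leaky integration of the stage above
        new_stages.append(m)
        prev = m
    return new_stages

# Feed an impulse followed by silence through a 3-stage memory:
# deeper taps retain the impulse longer, giving multi-scale history.
taps = [0.0, 0.0, 0.0]
for x in [1.0, 0.0, 0.0, 0.0]:
    taps = gamma_memory_step(taps, x, mu=0.5)
```

After these four steps the shallowest tap has mostly forgotten the impulse while the deepest still carries it, which is the multi-timescale behavior the gated hierarchy is meant to exploit.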
