
RotLSTM: Rotating Memories in Recurrent Neural Networks

2021-05-01

Vlad Velici, Adam Prügel-Bennett


Abstract

Long Short-Term Memory (LSTM) units have the ability to memorise and use long-term dependencies between inputs to generate predictions on time series data. We introduce the concept of modifying the cell state (memory) of LSTMs using rotation matrices parametrised by a new set of trainable weights. This addition yields significant increases in performance on some of the tasks from the bAbI dataset.
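The abstract does not spell out how the rotation is parametrised. As a minimal sketch under assumed details (the names `rotation_matrix`, `rotate_cell_state`, `W_rot`, and `b_rot` are illustrative, not from the paper), one simple construction is a block-diagonal matrix of 2-D rotations whose angles are a learned linear function of the current input:

```python
import numpy as np

def rotation_matrix(angles):
    """Build a block-diagonal rotation matrix from a vector of angles.

    Each angle rotates one 2-D plane of the cell state, so a state of
    size 2k needs k angles. This is one possible parametrisation; the
    paper's exact construction may differ.
    """
    k = len(angles)
    R = np.zeros((2 * k, 2 * k))
    for i, a in enumerate(angles):
        c, s = np.cos(a), np.sin(a)
        R[2 * i:2 * i + 2, 2 * i:2 * i + 2] = [[c, -s], [s, c]]
    return R

def rotate_cell_state(c_prev, x_t, W_rot, b_rot):
    """Rotate the previous cell state by angles predicted from the input.

    W_rot and b_rot stand in for the new trainable weights the paper
    introduces: angles = W_rot @ x_t + b_rot (hypothetical form).
    """
    angles = W_rot @ x_t + b_rot
    return rotation_matrix(angles) @ c_prev
```

Because the matrix is orthogonal, this transformation reorients the memory without changing its norm, which is one plausible motivation for using rotations rather than arbitrary linear maps on the cell state.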
