
Gate-Variants of Gated Recurrent Unit (GRU) Neural Networks

2017-01-20

Rahul Dey, Fathi M. Salem


Abstract

The paper evaluates three variants of the Gated Recurrent Unit (GRU) in recurrent neural networks (RNNs), obtained by reducing the parameters in the update and reset gates. We evaluate the three variant GRU models on the MNIST and IMDB datasets and show that these GRU-RNN variants perform comparably to the original GRU-RNN model while reducing computational expense.
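To make the parameter reductions concrete, the sketch below contrasts the standard GRU gate computation with three reduced forms in which the gates depend on progressively fewer terms (previous state plus bias, previous state only, or bias only). This is an illustrative NumPy implementation under that assumption, not the authors' code; the variant names `gru1`/`gru2`/`gru3` and the parameter layout are choices made here for clarity.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, params, variant="gru0"):
    """One GRU step on input x with previous hidden state h.

    `variant` selects which terms feed the update (z) and reset (r) gates:
      gru0 (standard): sigmoid(W x + U h + b)
      gru1:            sigmoid(U h + b)      -- drops the input term
      gru2:            sigmoid(U h)          -- drops input and bias
      gru3:            sigmoid(b)            -- bias only
    The candidate state always uses the full computation.
    """
    Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh = params

    def gate(W, U, b):
        if variant == "gru0":
            return sigmoid(W @ x + U @ h + b)
        if variant == "gru1":
            return sigmoid(U @ h + b)
        if variant == "gru2":
            return sigmoid(U @ h)
        return sigmoid(b)  # gru3: constant gate per unit

    z = gate(Wz, Uz, bz)                       # update gate
    r = gate(Wr, Ur, br)                       # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h) + bh)  # candidate state
    return (1.0 - z) * h + z * h_tilde         # interpolate old and new state
```

Dropping the input-to-gate matrices removes two `hidden x input` weight blocks per layer, which is where the computational savings reported in the abstract come from.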
