SOTAVerified

How much complexity does an RNN architecture need to learn syntax-sensitive dependencies?

2020-05-17 · ACL 2020 · Code Available

Gantavya Bhatt, Hritik Bansal, Rishubh Singh, Sumeet Agarwal

Abstract

Long short-term memory (LSTM) networks and their variants are capable of encapsulating long-range dependencies, which is evident from their performance on a variety of linguistic tasks. On the other hand, simple recurrent networks (SRNs), which appear more biologically grounded in terms of synaptic connections, have generally been less successful at capturing long-range dependencies as well as the loci of grammatical errors in an unsupervised setting. In this paper, we seek to develop models that bridge the gap between biological plausibility and linguistic competence. We propose a new architecture, the Decay RNN, which incorporates the decaying nature of neuronal activations and models the excitatory and inhibitory connections in a population of neurons. Besides its biological inspiration, our model also shows competitive performance relative to LSTMs on subject-verb agreement, sentence grammaticality, and language modeling tasks. These results provide some pointers towards probing the nature of the inductive biases required for RNN architectures to model linguistic phenomena successfully.
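
The abstract characterizes the Decay RNN only at a high level, without equations. The following is a minimal sketch of one possible reading, assuming the decay is implemented as a learnable convex combination of the previous hidden state and a fresh candidate activation. The class name, parameter names, and the ReLU nonlinearity are illustrative assumptions rather than the authors' reference implementation, and the sign-constrained excitatory/inhibitory connectivity mentioned in the abstract is not modelled here.

```python
# Minimal sketch of a single Decay RNN step in PyTorch.
# The update rule below is an assumed reading of the abstract, not the
# paper's exact formulation: the hidden state decays toward a candidate
# activation via a learnable decay coefficient alpha in (0, 1).
import torch
import torch.nn as nn


class DecayRNNCell(nn.Module):
    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        self.input2hidden = nn.Linear(input_size, hidden_size)
        self.hidden2hidden = nn.Linear(hidden_size, hidden_size, bias=False)
        # Learnable decay rate; the sigmoid keeps it in (0, 1).
        self.decay_logit = nn.Parameter(torch.zeros(1))

    def forward(self, x_t: torch.Tensor, h_prev: torch.Tensor) -> torch.Tensor:
        alpha = torch.sigmoid(self.decay_logit)
        # Candidate activation driven by the current input and previous state.
        candidate = self.input2hidden(x_t) + self.hidden2hidden(h_prev)
        # Decaying update: convex combination of the old activation and the new drive.
        return torch.relu(alpha * h_prev + (1.0 - alpha) * candidate)
```

Under this reading, a larger alpha makes activations persist longer across timesteps, which is one way a simple recurrent unit could retain information over long-range dependencies without gating.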

Tasks

Benchmark Results

Dataset | Model | Metric | Claimed | Verified | Status
WikiText-103 | LSTM | Test perplexity | 36.4 | – | Unverified
WikiText-103 | LSTM | Validation perplexity | 52.73 | – | Unverified
WikiText-103 | LSTM | Test perplexity | 48.7 | – | Unverified
WikiText-103 | GRU | Validation perplexity | 53.78 | – | Unverified
WikiText-103 | Decay RNN | Validation perplexity | 76.67 | – | Unverified
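
The perplexity figures above are the standard language-modelling metric: the exponential of the average per-token cross-entropy (in nats). A short illustration of how such a number is derived, using placeholder loss values rather than results from the paper:

```python
# perplexity = exp(mean per-token negative log-likelihood in nats).
# The values below are hypothetical placeholders, not numbers from the paper.
import math

per_token_nll = [3.9, 4.1, 3.8, 4.0]  # assumed per-token cross-entropy, in nats
perplexity = math.exp(sum(per_token_nll) / len(per_token_nll))
print(f"perplexity = {perplexity:.2f}")  # ~51.9 for these placeholder values
```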

Reproductions