Recurrent Neural Network Grammars
2016-02-25 · NAACL 2016 · Code Available
Chris Dyer, Adhiguna Kuncoro, Miguel Ballesteros, Noah A. Smith
Code
- github.com/clab/rnng (official, referenced in the paper)
- github.com/Psarpei/Recognition-of-logical-document-structures
- github.com/dpfried/rnng-bert (TensorFlow)
- github.com/yv/rnng
- github.com/gofortargets/rnng
- github.com/tempra28/nmtrnng (TensorFlow)
Abstract
We introduce recurrent neural network grammars, probabilistic models of sentences with explicit phrase structure. We explain efficient inference procedures that allow application to both parsing and language modeling. Experiments show that they provide better parsing in English than any single previously published supervised generative model and better language modeling than state-of-the-art sequential RNNs in English and Chinese.
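The "explicit phrase structure" comes from the paper's top-down generative story: the model emits a sequence of actions in which NT(X) opens a nonterminal constituent, GEN(w) generates a terminal word, and REDUCE closes the most recently opened constituent. The sketch below is not taken from any of the repositories above and omits the neural scoring of actions entirely; it simply replays a hand-written action sequence to show how such a sequence corresponds to a bracketed phrase-structure tree.

```python
# Minimal illustration (not the authors' implementation) of how an
# RNNG-style action sequence maps to a phrase-structure tree.
# NT(X) opens a constituent, GEN(w) adds a word, REDUCE closes the
# most recently opened constituent.

def replay_actions(actions):
    """Turn a top-down action sequence into a bracketed tree string."""
    stack = []  # each entry is a partially built constituent
    for act in actions:
        if act[0] == "NT":        # open a new constituent, e.g. ("NT", "NP")
            stack.append(["(" + act[1]])
        elif act[0] == "GEN":     # generate a word inside the open constituent
            stack[-1].append(act[1])
        elif act[0] == "REDUCE":  # close the open constituent
            finished = " ".join(stack.pop()) + ")"
            if stack:
                stack[-1].append(finished)
            else:
                return finished   # root constituent closed
    raise ValueError("action sequence did not close the root constituent")

# Example from the paper's running sentence "the hungry cat meows":
actions = [
    ("NT", "S"),
    ("NT", "NP"), ("GEN", "the"), ("GEN", "hungry"), ("GEN", "cat"), ("REDUCE",),
    ("NT", "VP"), ("GEN", "meows"), ("REDUCE",),
    ("REDUCE",),
]
print(replay_actions(actions))  # (S (NP the hungry cat) (VP meows))
```

In the full model, each action in such a sequence is assigned a probability conditioned on the entire history, so the same machinery yields both a parser (search for the best tree given a sentence) and a language model (marginalize over trees).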