Addressing the Data Sparsity Issue in Neural AMR Parsing
2017-02-16 · EACL 2017
Xiaochang Peng, Chuan Wang, Daniel Gildea, Nianwen Xue
Abstract
Neural attention models have achieved great success in different NLP tasks. However, they have not fulfilled their promise on the AMR parsing task due to the data sparsity issue. In this paper, we describe a sequence-to-sequence model for AMR parsing and present different ways to tackle the data sparsity problem. We show that our methods achieve significant improvement over a baseline neural attention model and our results are also competitive against state-of-the-art systems that do not use extra linguistic resources.
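The abstract names a sequence-to-sequence attention model as the base architecture. As a rough illustration only, here is a minimal PyTorch sketch of that model family: an LSTM encoder, an LSTM decoder, and dot-product attention over the encoder states. The `Seq2SeqAttention` class, its layer sizes, and the attention variant are all assumptions for exposition, not the authors' implementation.

```python
# Minimal seq2seq-with-attention sketch (illustrative; not the paper's model).
import torch
import torch.nn as nn

class Seq2SeqAttention(nn.Module):
    def __init__(self, src_vocab, tgt_vocab, emb_dim=128, hid_dim=256):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb_dim)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        # Decoder input: previous target embedding concatenated with context.
        self.decoder = nn.LSTM(emb_dim + hid_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, tgt_vocab)

    def forward(self, src, tgt):
        enc_out, hidden = self.encoder(self.src_emb(src))     # (B, S, H)
        context = enc_out[:, -1:, :]                          # initial context
        logits = []
        for t in range(tgt.size(1)):
            emb = self.tgt_emb(tgt[:, t:t+1])                 # (B, 1, E)
            dec_in = torch.cat([emb, context], dim=-1)        # (B, 1, E+H)
            dec_out, hidden = self.decoder(dec_in, hidden)    # (B, 1, H)
            # Dot-product attention over encoder states.
            scores = torch.bmm(dec_out, enc_out.transpose(1, 2))  # (B, 1, S)
            weights = torch.softmax(scores, dim=-1)
            context = torch.bmm(weights, enc_out)             # (B, 1, H)
            # Simple additive combination before projecting to the vocabulary.
            logits.append(self.out(dec_out + context))
        return torch.cat(logits, dim=1)                       # (B, T, V)

model = Seq2SeqAttention(src_vocab=1000, tgt_vocab=1200)
logits = model(torch.randint(0, 1000, (2, 7)), torch.randint(0, 1200, (2, 5)))
print(logits.shape)  # torch.Size([2, 5, 1200])
```

With small AMR training sets, a model like this has one softmax parameter per target-vocabulary item, which is one concrete way the data sparsity problem the abstract describes shows up in practice.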