
Transformer Semantic Parsing

2020-12-01 · ALTA 2020

Gabriela Ferraro, Hanna Suominen


Abstract

In neural semantic parsing, sentences are mapped to meaning representations using encoder-decoder frameworks. In this paper, we propose to apply the Transformer architecture, instead of recurrent neural networks, to this task. Experiments on two datasets from different domains and with different levels of difficulty show that our model achieves better results than strong baselines in certain settings and competitive results across all our experiments.
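The core mechanism that distinguishes the Transformer from recurrent encoders is attention: instead of compressing the input sentence into a single recurrent state, each decoding step attends directly over all encoder positions. A minimal numpy sketch of this cross-attention step follows; all dimensions, variable names, and random embeddings are illustrative assumptions, not the paper's actual model configuration.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    # numerically stable row-wise softmax
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

rng = np.random.default_rng(0)
d_model = 16
src_len, tgt_len = 5, 3  # e.g. a 5-token sentence, 3 meaning-representation tokens so far

# stand-in embeddings: encoder states for the sentence, decoder states for the partial output
src = rng.standard_normal((src_len, d_model))
tgt = rng.standard_normal((tgt_len, d_model))

# decoder cross-attention: each output token conditions on the whole input sentence
ctx = scaled_dot_product_attention(tgt, src, src)
print(ctx.shape)  # one d_model-sized context vector per output token
```

In a full Transformer semantic parser this step is stacked with masked self-attention and feed-forward layers, and the decoder emits meaning-representation tokens autoregressively.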
