
A Syntactic Neural Model for General-Purpose Code Generation

2017-04-06 · ACL 2017 · Code Available

Pengcheng Yin, Graham Neubig


Abstract

We consider the problem of parsing natural language descriptions into source code written in a general-purpose programming language like Python. Existing data-driven methods treat this problem as a language generation task without considering the underlying syntax of the target programming language. Informed by previous work in semantic parsing, in this paper we propose a novel neural architecture powered by a grammar model to explicitly capture the target syntax as prior knowledge. Experiments find this an effective way to scale up to generation of complex programs from natural language descriptions, achieving state-of-the-art results that substantially outperform previous code generation and semantic parsing approaches.
