
AutoCoder: Leveraging Transformers for Automatic Code Synthesis

2021-10-08 · NeurIPS Workshop AIPLANS 2021

Mrinal Anand, Pratik Kayal, Mayank Singh


Abstract

Program synthesis from natural language descriptions is a challenging task. This paper explores two transformer variants for program synthesis and shows that both outperform the existing state-of-the-art models. Toward the end, we also discuss the differences in the representations learned by the two variants, and demonstrate that the vanilla transformer model has a higher capacity to memorize the training data than the other variant.
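The abstract does not detail the architectures, but both transformer variants it compares build on scaled dot-product attention, the core operation of the vanilla transformer. As background only, here is a minimal NumPy sketch of that operation; the function name, tensor shapes, and random inputs are illustrative and not taken from the paper:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V.

    Q, K, V: arrays of shape (seq_len, d_k); single head, no masking.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)          # pairwise similarity, scaled
    scores -= scores.max(axis=-1, keepdims=True)  # numerically stable softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights              # output mix and attention map

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out, weights = scaled_dot_product_attention(Q, K, V)
```

Each row of `weights` is a probability distribution over the key positions, so the output is a convex combination of the value vectors. A full transformer stacks multi-head versions of this operation with feed-forward layers.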
