SOTAVerified

ProTo: Program-Guided Transformer for Program-Guided Tasks

2021-10-02 · NeurIPS 2021 · Code Available

Zelin Zhao, Karan Samel, Binghong Chen, Le Song

Abstract

Programs, which combine semantic and structural information, play an important role in communication between humans and agents. Toward learning general program executors that unify perception, reasoning, and decision making, we formulate program-guided tasks, which require learning to execute a given program on an observed task specification. We further propose the Program-guided Transformer (ProTo), which integrates both the semantic and the structural guidance of a program, using cross-attention and masked self-attention to pass messages between the specification and the routines in the program. ProTo executes a program in a learned latent space and enjoys stronger representational power than previous neural-symbolic approaches. We demonstrate that ProTo significantly outperforms previous state-of-the-art methods on the GQA visual-reasoning and 2D Minecraft policy-learning datasets. ProTo also generalizes better to unseen, complex, and human-written programs.
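The abstract describes two attention mechanisms: cross-attention lets program routines gather evidence from the task specification, while masked self-attention restricts message passing among routines to the program's structure. The following is a minimal numpy sketch of that idea, not the authors' implementation; all shapes, the toy band-shaped adjacency mask, and the single-head attention are illustrative assumptions.

```python
import numpy as np

def attention(q, k, v, mask=None):
    """Single-head scaled dot-product attention; mask keeps entries where True."""
    d = q.shape[-1]
    scores = q @ k.T / np.sqrt(d)
    if mask is not None:
        scores = np.where(mask, scores, -1e9)  # block disallowed edges
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w = w / w.sum(axis=-1, keepdims=True)
    return w @ v

# Toy dimensions (hypothetical): 4 program routines, 6 specification tokens, d = 8.
rng = np.random.default_rng(0)
d = 8
prog = rng.normal(size=(4, d))  # routine embeddings from the program
spec = rng.normal(size=(6, d))  # specification embeddings (e.g. image regions)

# Cross-attention: each routine attends to the specification.
prog = prog + attention(prog, spec, spec)

# Masked self-attention: routines exchange messages only along program structure.
# Hypothetical adjacency: each routine sees itself and its predecessor.
adj = np.tril(np.ones((4, 4), dtype=bool), k=0) & np.triu(np.ones((4, 4), dtype=bool), k=-1)
prog = prog + attention(prog, prog, prog, mask=adj)

print(prog.shape)  # (4, 8): updated routine representations in the latent space
```

In this sketch the program never leaves the latent space: execution amounts to repeatedly updating routine embeddings, which is what gives a learned executor more representational freedom than a fixed symbolic interpreter.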

Tasks

Benchmark Results

Dataset       Model  Metric    Claimed  Verified  Status
GQA test-std  ProTo  Accuracy  65.14    —         Unverified

Reproductions