
A Structural Transformer with Relative Positions in Trees for Code-to-Sequence Tasks

2020-06-04

Anonymous


Abstract

We propose two approaches for incorporating syntactic information into transformer models that encode trees (e.g. abstract syntax trees) and generate sequences. First, we use self-attention with relative position representations to capture structural relationships between nodes, using a representation that encodes movements between any pair of nodes in the tree, and we show how those movements can be computed efficiently on the fly. Second, we introduce an auxiliary loss that trains the network to predict the lowest common ancestor of node pairs. We apply both methods to source-code summarization tasks, where we outperform the state of the art by up to 6% F1. On natural-language machine translation, our models yield competitive results while being substantially faster than the alternatives. We also consistently outperform sequence-based transformers and demonstrate that our method yields representations that are more closely aligned with the tree's structure.
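The relative position of two tree nodes can be summarized by the movement between them: steps up from the source node to their lowest common ancestor (LCA), then steps down to the target. The sketch below is a minimal, hypothetical illustration of that idea using a parent-pointer tree; it is not the paper's implementation, and the function names are made up for this example.

```python
# Hypothetical sketch: encode the relative position between two tree nodes
# as a pair (up, down) of movements via their lowest common ancestor (LCA).
# `parent` maps each node to its parent (None for the root).

def ancestors(parent, node):
    """Path from `node` up to the root, inclusive."""
    path = [node]
    while parent[node] is not None:
        node = parent[node]
        path.append(node)
    return path

def tree_movement(parent, a, b):
    """(up, down): steps from `a` up to the LCA, then down to `b`."""
    anc_b = set(ancestors(parent, b))
    # walk up from `a` until we hit an ancestor of `b` -- that node is the LCA
    for up, node in enumerate(ancestors(parent, a)):
        if node in anc_b:
            lca = node
            break
    down = ancestors(parent, b).index(lca)
    return up, down

# Tiny AST-like tree: node 0 is the root.
#        0
#       / \
#      1   2
#     / \
#    3   4
parent = {0: None, 1: 0, 2: 0, 3: 1, 4: 1}
print(tree_movement(parent, 3, 4))  # sibling leaves -> (1, 1)
print(tree_movement(parent, 3, 2))  # via the root  -> (2, 1)
```

In a relative-position transformer, each (up, down) pair would index a learned embedding added to the attention scores, so computing movements lazily per node pair avoids materializing a full pairwise table in advance.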
