
Head-First Linearization with Tree-Structured Representation

2019-10-01 · WS 2019

Xiang Yu, Agnieszka Falenska, Ngoc Thang Vu, Jonas Kuhn


Abstract

We present a dependency tree linearization model with two novel components: (1) a tree-structured encoder based on a bidirectional Tree-LSTM that propagates information first bottom-up and then top-down, allowing each token to access information from the entire tree; and (2) a linguistically motivated head-first decoder that emphasizes the central role of the head and linearizes each subtree by incrementally attaching the dependents on both sides of the head. With the new encoder and decoder, we reach state-of-the-art performance on the Surface Realization Shared Task 2018 dataset, outperforming not only the shared task participants but also previous state-of-the-art systems (Bohnet et al., 2011; Puduppully et al., 2016). Furthermore, we analyze the power of the tree-structured encoder with a probing task and show that it is able to recognize the topological relation between any pair of tokens in a tree.
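The head-first traversal scheme described in the abstract can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: in the actual model a trained decoder chooses incrementally where to attach each dependent, whereas here the left/right order of dependents is given as input, so the code only shows how a subtree string is assembled around its head.

```python
# Sketch of head-first linearization: each subtree is rendered by
# recursively expanding the dependents on both sides of the head.
# The `tree` mapping (a stand-in for the decoder's attachment
# decisions) lists each token's left and right dependents in order.

def linearize(head, left_deps, right_deps, tree):
    """Return the token sequence for the subtree rooted at `head`."""
    out = []
    for dep in left_deps:                      # expand left dependents
        out.extend(linearize(dep, *tree.get(dep, ([], [])), tree))
    out.append(head)                           # place the head itself
    for dep in right_deps:                     # expand right dependents
        out.extend(linearize(dep, *tree.get(dep, ([], [])), tree))
    return out

# Toy dependency tree for "the cat sat on a mat":
# sat -> cat (left), on (right); cat -> the; on -> mat; mat -> a
tree = {
    "sat": (["cat"], ["on"]),
    "cat": (["the"], []),
    "on":  ([], ["mat"]),
    "mat": (["a"], []),
}
print(" ".join(linearize("sat", *tree["sat"], tree)))
# -> the cat sat on a mat
```

Because the head is emitted before its dependents are expanded, every attachment decision is made relative to an already-placed head, which is the intuition behind the head-first decoder.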
