
Inducing Syntactic Trees from BERT Representations

2019-06-27

Rudolf Rosa, David Mareček


Abstract

We use the English model of BERT and explore how deleting one word in a sentence changes the representations of the other words. Our hypothesis is that removing a reducible word (e.g. an adjective) does not affect the representations of the other words as much as removing e.g. the main verb, which makes the sentence ungrammatical and thus highly surprising for the language model. We estimate the reducibilities of individual words and of longer continuous phrases (word n-grams), study their syntax-related properties, and then use them to induce full dependency trees.
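The core procedure the abstract describes can be sketched as follows: delete one token, re-embed the sentence, and measure how much the representations of the remaining tokens moved. This is a minimal sketch, not the authors' implementation: the `embed` function is a toy deterministic stand-in for BERT's contextual embeddings (the paper uses the English BERT model), and the Euclidean-distance aggregation is an assumed metric for illustration.

```python
import numpy as np

def embed(tokens):
    """Toy stand-in for a contextual encoder such as BERT.

    Returns one 8-dimensional vector per token, seeded by the whole
    sentence so that representations are sentence-dependent, mimicking
    the contextual nature of BERT embeddings. A real implementation
    would run the sentence through the English BERT model instead.
    """
    rng = np.random.default_rng(abs(hash(" ".join(tokens))) % (2**32))
    return rng.standard_normal((len(tokens), 8))

def reducibility(tokens, i):
    """Estimate how 'reducible' token i is.

    Following the paper's idea: embed the sentence with and without
    token i, then average how far the remaining tokens' representations
    moved. The Euclidean distance here is an assumption; the paper may
    use a different comparison.
    """
    full = embed(tokens)                       # representations with token i
    reduced = embed(tokens[:i] + tokens[i + 1:])  # representations without it
    kept = np.delete(full, i, axis=0)          # align the surviving tokens
    return float(np.linalg.norm(kept - reduced, axis=1).mean())

# One reducibility score per word; low scores would mark words (like
# adjectives) whose deletion barely perturbs the rest of the sentence.
sent = "the big dog chased the cat".split()
scores = [reducibility(sent, i) for i in range(len(sent))]
```

Extending the same scoring to word n-grams (delete a contiguous span instead of a single token), and then building a dependency tree from the scores, would follow the same embed-delete-compare pattern.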
