SOTAVerified

Does BERT agree? Evaluating knowledge of structure dependence through agreement relations

2019-08-26

Geoff Bacon, Terry Regier

Abstract

Learning representations that accurately model semantics is an important goal of natural language processing research. Many semantic phenomena depend on syntactic structure. Recent work examines the extent to which state-of-the-art models for pre-training representations, such as BERT, capture such structure-dependent phenomena, but is largely restricted to one phenomenon in English: number agreement between subjects and verbs. We evaluate BERT's sensitivity to four types of structure-dependent agreement relations in a new semi-automatically curated dataset across 26 languages. We show that both the single-language and multilingual BERT models capture syntax-sensitive agreement patterns well in general, but we also highlight the specific linguistic contexts in which their performance degrades.
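The evaluation paradigm the abstract describes — checking whether a masked language model prefers the verb form that agrees with the structurally correct controller rather than a nearby attractor noun — can be sketched as follows. Note that `score()` here is a hypothetical toy stand-in with illustrative fixed values; a real probe would query a pretrained masked LM (e.g. BERT) for the probability of each candidate in the masked slot.

```python
# Sketch of a structure-dependent agreement probe: mask the target verb
# and test whether the model assigns a higher score to the form agreeing
# with the head noun ("keys") than with the intervening attractor ("cabinet").


def score(sentence: str, candidate: str) -> float:
    """Toy scorer with fixed illustrative probabilities.

    A real probe would return the masked-LM probability of `candidate`
    filling the [MASK] position in `sentence`; these numbers are
    hypothetical stand-ins, not model outputs.
    """
    toy_probs = {"are": 0.7, "is": 0.3}  # illustrative values only
    return toy_probs.get(candidate, 0.0)


def prefers_grammatical(sentence: str, grammatical: str, ungrammatical: str) -> bool:
    """True if the scorer ranks the agreeing verb form above the non-agreeing one."""
    return score(sentence, grammatical) > score(sentence, ungrammatical)


# Subject-verb number agreement with an intervening attractor noun:
masked = "The keys to the cabinet [MASK] on the table."
print(prefers_grammatical(masked, "are", "is"))  # True with the toy scores
```

Aggregating this preference over many such minimal pairs, across agreement types and languages, yields the kind of accuracy figures the paper reports.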
