
Fill the GAP: Exploiting BERT for Pronoun Resolution

2019-08-01 · WS 2019 · Code Available

Kai-Chou Yang, Timothy Niven, Tzu Hsuan Chou, Hung-Yu Kao


Abstract

In this paper, we describe our entry in the gendered pronoun resolution competition, which achieved fourth place without data augmentation. Our method is an ensemble of BERT models that resolves coreference in an interaction space. We report four insights from our work: BERT's representations involve significant redundancy; modeling interaction effects, as natural language inference models do, is useful for this task; there is an optimal BERT layer from which to extract representations for pronoun resolution; and the difference between the attention weights from the pronoun to the candidate entities is highly correlated with the correct label, with interesting implications for future work.
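The NLI-style interaction modeling mentioned in the abstract can be illustrated with standard matching features over span representations. This is a hedged sketch, not the authors' exact architecture: the function name `interaction_features` and the concatenate/difference/product layout are common NLI conventions assumed here for illustration, and the toy vectors stand in for representations extracted from a single chosen BERT layer.

```python
import numpy as np

def interaction_features(pronoun_vec, candidate_vec):
    """NLI-style interaction between two span representations.

    Concatenates the two vectors with their element-wise difference
    and product (a typical matching layout; the paper's actual
    feature construction may differ).
    """
    return np.concatenate([
        pronoun_vec,
        candidate_vec,
        pronoun_vec - candidate_vec,
        pronoun_vec * candidate_vec,
    ])

# Toy stand-ins for BERT-base (768-d) representations of the pronoun
# and one candidate entity, taken from one layer of the encoder.
rng = np.random.default_rng(0)
p = rng.standard_normal(768)   # pronoun span representation
a = rng.standard_normal(768)   # candidate entity span representation

feats = interaction_features(p, a)
print(feats.shape)  # -> (3072,)
```

A classifier over such interaction vectors (one per pronoun-candidate pair) would then score each candidate, with the ensemble averaging scores across BERT variants.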
