
Incremental Neural Coreference Resolution in Constant Memory

2020-04-30 · EMNLP 2020 · Code Available

Patrick Xia, João Sedoc, Benjamin Van Durme


Abstract

We investigate modeling coreference resolution under a fixed memory constraint by extending an incremental clustering algorithm to utilize contextualized encoders and neural components. Given a new sentence, our end-to-end algorithm proposes and scores each mention span against explicit entity representations created from the earlier document context (if any). These spans are then used to update the entity's representations before being forgotten; we only retain a fixed set of salient entities throughout the document. In this work, we successfully convert a high-performing model (Joshi et al., 2020), asymptotically reducing its memory usage to constant space with only a 0.3% relative loss in F1 on OntoNotes 5.0.
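The loop described in the abstract can be sketched in miniature: each incoming mention span is scored against a bounded set of explicit entity representations, linked to the best-scoring entity (or started as a new one), and then discarded, with only the fixed entity set carried forward. The class and method names, the dot-product scorer, the running-mean update, and the least-recently-updated eviction policy below are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of incremental, constant-memory coreference clustering.
# All names and policies here are hypothetical stand-ins for the paper's
# learned neural scoring and update components.

def dot(u, v):
    """Toy span-vs-entity scorer (the paper uses learned neural scoring)."""
    return sum(a * b for a, b in zip(u, v))

class IncrementalCoref:
    def __init__(self, max_entities=3, threshold=0.5):
        self.max_entities = max_entities  # fixed memory budget
        self.threshold = threshold        # minimum score to link a span
        # each entry: (representation, mention_count, last_update_step)
        self.entities = []
        self.step = 0

    def observe(self, span_vec):
        """Score one mention span against retained entities, link or start
        a new entity, then forget the span; only entity reps persist."""
        self.step += 1
        best_i, best_score = None, self.threshold
        for i, (rep, _, _) in enumerate(self.entities):
            score = dot(span_vec, rep)
            if score > best_score:
                best_i, best_score = i, score
        if best_i is not None:
            rep, n, _ = self.entities[best_i]
            # running-mean update of the entity representation
            new_rep = [(r * n + s) / (n + 1) for r, s in zip(rep, span_vec)]
            self.entities[best_i] = (new_rep, n + 1, self.step)
            return best_i
        # over budget: evict the least recently updated (least salient) entity
        if len(self.entities) >= self.max_entities:
            evict = min(range(len(self.entities)),
                        key=lambda i: self.entities[i][2])
            self.entities.pop(evict)
        self.entities.append((list(span_vec), 1, self.step))
        return len(self.entities) - 1
```

Because only `max_entities` representations are ever retained, memory stays constant in document length, which is the property the paper establishes for its converted neural model.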
