
Cross-Lingual Transfer in Zero-Shot Cross-Language Entity Linking

2020-10-19 · Findings (ACL) 2021 · Code Available

Elliot Schumacher, James Mayfield, Mark Dredze

Abstract

Cross-language entity linking grounds mentions in multiple languages to a single-language knowledge base. We propose a neural ranking architecture for this task that uses multilingual BERT representations of the mention and the context in a neural network. We find that the multilingual ability of BERT leads to robust performance in monolingual and multilingual settings. Furthermore, we explore zero-shot language transfer and find surprisingly robust performance. We investigate the zero-shot degradation and find that it can be partially mitigated by a proposed auxiliary training objective, but that the remaining error can best be attributed to domain shift rather than language transfer.
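To make the ranking idea in the abstract concrete, here is a minimal, self-contained sketch: embed the mention and each knowledge-base candidate, score each mention–candidate pair, and sort by score. The `embed` function is a hypothetical stand-in for real multilingual BERT encodings, and the interaction matrix `W` is an untrained placeholder for the learned scorer; none of these names come from the paper.

```python
import hashlib
import numpy as np

DIM = 8  # toy embedding size; mBERT uses 768

def embed(text: str, dim: int = DIM) -> np.ndarray:
    """Hypothetical stand-in for an mBERT encoder: a deterministic
    pseudo-embedding derived from the text (NOT the paper's encoder)."""
    seed = int.from_bytes(hashlib.md5(text.encode("utf-8")).digest()[:4], "big")
    return np.random.default_rng(seed).standard_normal(dim)

def score(mention_vec: np.ndarray, entity_vec: np.ndarray,
          W: np.ndarray) -> float:
    # Bilinear compatibility score; with W = I this is a plain dot product.
    return float(mention_vec @ W @ entity_vec)

def link(mention: str, candidates: list[str]) -> list[tuple[str, float]]:
    """Rank knowledge-base candidates for one mention, highest score first."""
    W = np.eye(DIM)  # untrained placeholder for a learned interaction matrix
    m = embed(mention)
    pairs = [(c, score(m, embed(c), W)) for c in candidates]
    return sorted(pairs, key=lambda p: p[1], reverse=True)
```

In the paper's setting the encoder is shared across languages, which is what allows a ranker trained on one language's mentions to be applied zero-shot to another.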
