Contextualized End-to-End Neural Entity Linking
2019-11-10 · Asian Chapter of the Association for Computational Linguistics
Haotian Chen, Andrej Zukov-Gregoric, Xi David Li, Sahil Wadhwa
Abstract
We propose yet another entity linking model (YELM) which links words to entities instead of spans. This overcomes any difficulties associated with the selection of good candidate mention spans and makes the joint training of mention detection (MD) and entity disambiguation (ED) easily possible. Our model is based on BERT and produces contextualized word embeddings which are trained against a joint MD and ED objective. We achieve state-of-the-art results on several standard entity linking (EL) datasets.
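The abstract describes linking individual words, rather than candidate spans, to entities, so that mention detection and entity disambiguation collapse into a single per-word decision. The following is a minimal, hypothetical sketch of that idea; the random vectors stand in for BERT contextualized word embeddings and for learned entity embeddings, and the entity names and the NIL label are illustrative assumptions, not the paper's actual setup.

```python
import numpy as np

# Stand-ins for BERT contextualized word embeddings (random here, purely illustrative).
rng = np.random.default_rng(0)
num_words, dim = 5, 8
word_embs = rng.normal(size=(num_words, dim))

# Hypothetical entity inventory; NIL means "this word is not part of any mention".
entities = ["NIL", "Q90_Paris", "Q142_France"]
entity_embs = rng.normal(size=(len(entities), dim))  # stand-in for learned entity embeddings

# Score every word against every entity and normalize with a softmax,
# so each word gets a probability distribution over entities (including NIL).
logits = word_embs @ entity_embs.T
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)

# Each word is linked independently via argmax. Because predicting NIL handles
# mention detection and predicting an entity handles disambiguation, a single
# cross-entropy loss over these distributions trains MD and ED jointly.
predictions = [entities[i] for i in probs.argmax(axis=1)]
print(predictions)
```

Training such a model would minimize cross-entropy between these per-word distributions and gold word-level entity labels, which is what removes the need to enumerate and score candidate mention spans.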