
Cross-Domain Modeling of Sentence-Level Evidence for Document Retrieval

2019-11-01 · IJCNLP 2019

Zeynep Akkalyoncu Yilmaz, Wei Yang, Haotian Zhang, Jimmy Lin

Abstract

This paper applies BERT to ad hoc document retrieval on news articles, which requires addressing two challenges: relevance judgments in existing test collections are typically provided only at the document level, and documents often exceed the length that BERT was designed to handle. Our solution is to aggregate sentence-level evidence to rank documents. Furthermore, we are able to leverage passage-level relevance judgments fortuitously available in other domains to fine-tune BERT models that are able to capture cross-domain notions of relevance, and can be directly used for ranking news articles. Our simple neural ranking models achieve state-of-the-art effectiveness on three standard test collections.
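The aggregation idea in the abstract — scoring a long document by combining the scores of its best sentences with a first-stage retrieval score — can be sketched minimally as follows. This is an illustrative scheme, not the paper's exact formula: the function name, the toy scores, the interpolation weight `alpha`, and the top-`k` cutoff are all assumptions for the example.

```python
def rank_documents(bm25_scores, sentence_scores, alpha=0.5, k=3):
    """Rank documents by interpolating a document-level retrieval score
    (e.g. BM25) with the sum of each document's top-k sentence scores
    (e.g. from a fine-tuned BERT relevance model).

    bm25_scores:     dict mapping doc_id -> document-level score
    sentence_scores: dict mapping doc_id -> list of per-sentence scores
    Returns a list of (doc_id, score) pairs, best first.
    """
    ranked = {}
    for doc_id, bm25 in bm25_scores.items():
        # Keep only the k strongest sentence-level signals per document.
        top_k = sorted(sentence_scores.get(doc_id, []), reverse=True)[:k]
        ranked[doc_id] = alpha * bm25 + (1 - alpha) * sum(top_k)
    return sorted(ranked.items(), key=lambda kv: kv[1], reverse=True)


# Toy usage: d2 has a slightly lower BM25 score but much stronger
# sentence-level evidence, so it is ranked first.
ranking = rank_documents(
    {"d1": 10.0, "d2": 9.0},
    {"d1": [0.2, 0.1], "d2": [0.9, 0.8, 0.7]},
)
```

In practice the document-level and sentence-level scores live on different scales, so a real implementation would normalize them (or tune `alpha` per collection) before interpolating.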
