
NextLevelBERT: Masked Language Modeling with Higher-Level Representations for Long Documents

2024-02-27

Tamara Czinczoll, Christoph Hönes, Maximilian Schall, Gerard de Melo


Abstract

While (large) language models have significantly improved over the last years, they still struggle to sensibly process long sequences found, e.g., in books, due to the quadratic scaling of the underlying attention mechanism. To address this, we propose NextLevelBERT, a Masked Language Model operating not on tokens, but on higher-level semantic representations in the form of text embeddings. We pretrain NextLevelBERT to predict the vector representation of entire masked text chunks and evaluate the effectiveness of the resulting document vectors on three types of tasks: 1) Semantic Textual Similarity via zero-shot document embeddings, 2) Long document classification, 3) Multiple-choice question answering. We find that next-level Masked Language Modeling is an effective technique to tackle long-document use cases and can outperform much larger embedding models as long as the required level of detail of semantic information is not too fine. Our models and code are publicly available online.
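
The idea of masked language modeling over chunk embeddings rather than tokens can be illustrated with a minimal sketch: precomputed text-chunk vectors are fed to a small transformer, a fraction of them is replaced by a learned mask vector, and the model regresses the original embeddings at the masked positions. This is only an illustrative assumption of how such a setup could look; the class names, dimensions, masking rate, and loss below are hypothetical and not taken from the authors' released code.

```python
# Hypothetical sketch of next-level masked language modeling over chunk embeddings.
# Chunk vectors (e.g., from a sentence encoder) are masked and reconstructed.
import torch
import torch.nn as nn


class NextLevelMLM(nn.Module):
    def __init__(self, dim: int = 384, depth: int = 6, heads: int = 6, max_chunks: int = 512):
        super().__init__()
        self.mask_token = nn.Parameter(torch.zeros(dim))            # learned [MASK] vector
        self.pos_emb = nn.Parameter(torch.zeros(max_chunks, dim))   # chunk-level positions
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=depth)
        self.head = nn.Linear(dim, dim)                             # predicts chunk embeddings

    def forward(self, chunk_embs: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
        # chunk_embs: (batch, num_chunks, dim) precomputed text-chunk embeddings
        # mask:       (batch, num_chunks) boolean, True where a chunk is masked out
        x = torch.where(mask.unsqueeze(-1), self.mask_token, chunk_embs)
        x = x + self.pos_emb[: x.size(1)]
        return self.head(self.encoder(x))


def mlm_loss(pred: torch.Tensor, target: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
    # Regress the original chunk embeddings at masked positions only.
    return nn.functional.mse_loss(pred[mask], target[mask])


if __name__ == "__main__":
    batch, n_chunks, dim = 2, 128, 384
    chunk_embs = torch.randn(batch, n_chunks, dim)   # stand-in for real chunk embeddings
    mask = torch.rand(batch, n_chunks) < 0.15        # mask ~15% of chunks (assumed rate)
    model = NextLevelMLM(dim=dim)
    loss = mlm_loss(model(chunk_embs, mask), chunk_embs, mask)
    loss.backward()
```

In practice the chunk embeddings would come from an off-the-shelf text encoder applied to fixed-size spans of the document, and the pooled output of the chunk-level model would serve as the document vector used in the downstream tasks listed in the abstract.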
