
Neural Temporality Adaptation for Document Classification: Diachronic Word Embeddings and Domain Adaptation Models

2019-07-01 · ACL 2019 · Code Available

Xiaolei Huang, Michael J. Paul


Abstract

Language usage can change across periods of time, but document classification models are usually trained and tested on corpora spanning multiple years without considering temporal variations. This paper describes two complementary ways to adapt classifiers to shifts across time. First, we show that diachronic word embeddings, which were originally developed to study language change, can also improve document classification, and we present a simple method for constructing this type of embedding. Second, we propose a time-driven neural classification model inspired by methods for domain adaptation. Experiments on six corpora show how these methods can make classifiers more robust over time.
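The abstract mentions diachronic word embeddings without detailing their construction. As a generic illustration only (not necessarily the construction this paper proposes), a common approach in the literature trains separate embeddings per time period and aligns them into a shared space with orthogonal Procrustes. The sketch below, with toy data and illustrative names, shows that alignment step using only NumPy:

```python
import numpy as np

def procrustes_align(base, other):
    """Rotate `other` into the space of `base` via orthogonal Procrustes.

    Both matrices are (vocab_size, dim) embeddings over a shared vocabulary,
    e.g. from two different time periods. Returns the rotated copy of `other`.
    """
    # The SVD of the cross-covariance matrix yields the optimal rotation
    # minimizing ||other @ R - base||_F over orthogonal R.
    u, _, vt = np.linalg.svd(other.T @ base)
    rotation = u @ vt
    return other @ rotation

# Toy example: two "periods" whose embeddings differ only by a rotation.
rng = np.random.default_rng(0)
period_a = rng.normal(size=(50, 8))           # embeddings from period A
q, _ = np.linalg.qr(rng.normal(size=(8, 8)))  # random orthogonal matrix
period_b = period_a @ q                       # period B = rotated period A

aligned_b = procrustes_align(period_a, period_b)
print(np.allclose(aligned_b, period_a, atol=1e-8))  # True: rotation recovered
```

Once period-specific embeddings are aligned this way, a word's vectors from different periods live in one space, so a classifier can consume time-appropriate representations of the same vocabulary.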
