
Learning Dynamic Contextualised Word Embeddings via Template-based Temporal Adaptation

2022-08-23 · Code Available

Xiaohang Tang, Yi Zhou, Danushka Bollegala


Abstract

Dynamic contextualised word embeddings (DCWEs) represent the temporal semantic variations of words. We propose a method for learning DCWEs by time-adapting a pretrained Masked Language Model (MLM) using time-sensitive templates. Given two snapshots C_1 and C_2 of a corpus taken respectively at two distinct timestamps T_1 and T_2, we first propose an unsupervised method to select (a) pivot terms related to both C_1 and C_2, and (b) anchor terms that are associated with a specific pivot term in each individual snapshot. We then generate prompts by filling manually compiled templates with the extracted pivot and anchor terms. Moreover, we propose an automatic method to learn time-sensitive templates from C_1 and C_2, without requiring any human supervision. Next, we adapt the pretrained MLM to T_2 by fine-tuning it on the generated prompts. Multiple experiments show that our proposed method reduces the perplexity of test sentences in C_2, outperforming the current state-of-the-art.
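The pipeline the abstract describes (fill templates with pivot/anchor terms to produce prompts, then fine-tune the MLM on those prompts) can be sketched as below. This is a minimal illustration assuming a HuggingFace-style MLM; the base model, pivot/anchor pairs, template strings, and hyperparameters are hypothetical placeholders, not the paper's actual choices.

```python
# Sketch of template-based temporal adaptation: generate prompts from
# (pivot, anchor) terms, then fine-tune a pretrained MLM on them with
# the standard masked-language-modelling objective.
import torch
from torch.optim import AdamW
from transformers import (AutoTokenizer, AutoModelForMaskedLM,
                          DataCollatorForLanguageModeling)

model_name = "bert-base-uncased"  # assumed base MLM; the paper's may differ
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

# Hypothetical pivot -> (anchor in C_1, anchor in C_2) pairs, assumed to be
# extracted upstream by the unsupervised selection step.
pivot_anchors = {
    "cell": ("biology", "phone"),
    "stream": ("river", "video"),
}

# Illustrative manually compiled templates with slots for a pivot and its
# two time-specific anchors.
templates = [
    "{pivot} was associated with {anchor_t1}, and is now associated with {anchor_t2}.",
    "Unlike before, {pivot} now relates to {anchor_t2} rather than {anchor_t1}.",
]

# Fill every template with every pivot/anchor combination to form prompts.
prompts = [
    t.format(pivot=p, anchor_t1=a1, anchor_t2=a2)
    for p, (a1, a2) in pivot_anchors.items()
    for t in templates
]

# Tokenise the prompts and fine-tune with random token masking.
encodings = tokenizer(prompts, truncation=True, padding=True, return_tensors="pt")
features = [{k: v[i] for k, v in encodings.items()} for i in range(len(prompts))]
collator = DataCollatorForLanguageModeling(tokenizer, mlm=True, mlm_probability=0.15)

optimizer = AdamW(model.parameters(), lr=5e-5)
model.train()
for epoch in range(3):              # a few passes over the small prompt set
    batch = collator(features)      # re-samples masked positions each epoch
    outputs = model(**batch)        # collator supplies the MLM labels
    outputs.loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```

Per the abstract, the intended effect of this fine-tuning is to pull the MLM's representations of each pivot toward its T_2 usage, which is what lowers perplexity on test sentences from C_2.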
