
What Does This Word Mean? Explaining Contextualized Embeddings with Natural Language Definition

2019-11-01 · IJCNLP 2019

Ting-Yun Chang, Yun-Nung Chen


Abstract

Contextualized word embeddings have boosted many NLP tasks compared with traditional static word embeddings. However, a word used in a specific sense may still receive different contextualized embeddings across its various contexts. To further investigate what contextualized word embeddings capture, this paper analyzes whether they can indicate the corresponding sense definitions and proposes a general framework for explaining word meanings given contextualized word embeddings, enabling better interpretation. The experiments show that both ELMo and BERT embeddings can be interpreted well in a readable textual form, and the findings may help the research community better understand what these embeddings capture.
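The abstract's premise is that a static embedding assigns one fixed vector per word, while a contextualized encoder produces different vectors for the same word in different sentences. The following toy sketch (not the paper's framework; the hash-seeded vectors and neighbor-averaging "contextualizer" are illustrative stand-ins for models like GloVe and ELMo/BERT) makes that distinction concrete:

```python
import hashlib
import random

def static_embedding(word, dim=8):
    # Toy stand-in for a static embedding table (e.g. GloVe):
    # one deterministic vector per word, independent of context.
    seed = int.from_bytes(hashlib.sha256(word.encode()).digest()[:4], "big")
    rng = random.Random(seed)
    return [rng.uniform(-1, 1) for _ in range(dim)]

def contextual_embedding(sentence, target, dim=8):
    # Toy stand-in for a contextualized encoder: the target's static
    # vector averaged with the sentence-wide mean, so the output
    # depends on the surrounding words.
    tokens = sentence.lower().split()
    vecs = [static_embedding(t, dim) for t in tokens]
    mean = [sum(v[i] for v in vecs) / len(vecs) for i in range(dim)]
    return [(a + b) / 2 for a, b in zip(static_embedding(target, dim), mean)]

s1 = "the bank raised interest rates"
s2 = "we sat on the river bank"

# Static: identical vectors for "bank" regardless of sense.
print(static_embedding("bank") == static_embedding("bank"))  # True

# Contextual: the two occurrences of "bank" get different vectors.
print(contextual_embedding(s1, "bank") != contextual_embedding(s2, "bank"))  # True
```

This is exactly the property the paper probes: given such a context-dependent vector, can a model recover a readable definition of the sense being used?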
