
Metaphor Detection Using Contextual Word Embeddings From Transformers

2020-07-01 · WS 2020

Jerry Liu, Nathan O'Hara, Alex Rubin, Rachel Draelos, Cynthia Rudin


Abstract

The detection of metaphors can provide valuable information about a given text and is crucial to sentiment analysis and machine translation. In this paper, we outline the techniques for word-level metaphor detection used in our submission to the Second Shared Task on Metaphor Detection. We propose using both BERT and XLNet language models to create contextualized embeddings and a bi-directional LSTM to identify whether a given word is a metaphor. Our best model achieved F1-scores of 68.0% on VUA AllPOS, 73.0% on VUA Verbs, 66.9% on TOEFL AllPOS, and 69.7% on TOEFL Verbs, placing 7th, 6th, 5th, and 5th respectively. In addition, we outline another potential approach with a KNN-LSTM ensemble model that we did not have enough time to implement given the deadline for the competition. We show that a KNN classifier provides a similar F1-score on a validation set as the LSTM and yields different information on metaphors.
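As a rough illustration of the KNN approach mentioned in the abstract, a word can be labeled metaphorical by majority vote over the nearest contextual embeddings of training words. This is only a minimal sketch: the embedding vectors, dimensions, and data below are hypothetical toy values, whereas the paper's embeddings come from BERT/XLNet.

```python
import numpy as np

def knn_predict(train_emb, train_labels, query_emb, k=3):
    """Label a query word's contextual embedding by majority vote
    over its k nearest training embeddings (Euclidean distance).
    Returns 1 for metaphorical, 0 for literal."""
    dists = np.linalg.norm(train_emb - query_emb, axis=1)
    nearest = np.argsort(dists)[:k]
    votes = train_labels[nearest]
    return int(2 * votes.sum() > len(votes))

# Toy 4-dimensional "contextual embeddings" (hypothetical data):
# metaphorical uses cluster in one region, literal uses in another.
train_emb = np.array([
    [0.9, 0.1, 0.0, 0.0],
    [0.8, 0.2, 0.1, 0.0],
    [0.0, 0.1, 0.9, 0.8],
    [0.1, 0.0, 0.8, 0.9],
])
train_labels = np.array([1, 1, 0, 0])  # 1 = metaphor, 0 = literal

query = np.array([0.85, 0.15, 0.05, 0.0])
print(knn_predict(train_emb, train_labels, query, k=3))  # → 1
```

A classifier like this is attractive as an ensemble partner for the LSTM because, as the abstract notes, it reaches a similar F1 while capturing different information about which uses are metaphorical.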
