Bigrams and BiLSTMs: Two Neural Networks for Sequential Metaphor Detection

2018-06-01 · WS 2018 · Code Available

Yuri Bizzoni, Mehdi Ghanimifard

Abstract

We present and compare two alternative deep neural architectures for word-level metaphor detection on text: a bi-LSTM model and a new structure based on recursive feed-forward concatenation of the input. We discuss different versions of these models and the effect that input manipulation - specifically, reducing the length of sentences and introducing concreteness scores for words - has on their performance.
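One of the input manipulations the abstract mentions is introducing concreteness scores for words. A minimal sketch of how such scores might be appended to word embeddings before feeding a sequence model is shown below; the lexicon values, the `embed` stand-in, and the default score are all illustrative assumptions, not details from the paper.

```python
# Hypothetical concreteness lexicon (scores on a 1-5 scale, illustrative only).
CONCRETENESS = {"stone": 4.9, "idea": 1.6, "heart": 4.5}

def embed(word):
    # Stand-in for a pretrained embedding lookup (e.g. GloVe);
    # real embeddings would be high-dimensional learned vectors.
    return [float(len(word)), float(ord(word[0]) % 7)]

def augment(sentence, default=2.5):
    """Concatenate each word's concreteness score onto its embedding.

    Words missing from the lexicon get a neutral default score
    (an assumption; the paper may handle out-of-lexicon words differently).
    """
    features = []
    for word in sentence:
        vec = embed(word)
        vec.append(CONCRETENESS.get(word, default))
        features.append(vec)
    return features

feats = augment(["the", "stone", "heart"])
# Each feature vector now carries the embedding dims plus one concreteness dim,
# e.g. feats[1][-1] is the concreteness score for "stone".
```

The augmented feature sequence would then be passed to the sequence labeler (e.g. the bi-LSTM) in place of the raw embeddings.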
