SOTAVerified

daminglu123 at SemEval-2022 Task 2: Using BERT and LSTM to Do Text Classification

2022-07-01 · SemEval (NAACL) 2022 · Unverified

Daming Lu



Abstract

Multiword expressions (MWEs), or idiomatic phrases, are a common phenomenon in natural languages. Current pre-trained language models cannot effectively capture the meaning of these MWEs. The reason is that two ordinary words, once combined, can take on a meaning quite different from the composition of their individual meanings, whereas pre-trained language models rely on word compositionality. We propose an improved method that adds an LSTM layer to the BERT model in order to obtain better results on a text classification task (Subtask A). Our result is slightly better than the baseline. We also tried adding TextCNN to BERT, and adding both LSTM and TextCNN to BERT, and found that adding only the LSTM gives the best performance.
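The architecture the abstract describes (BERT token representations fed through an LSTM layer, then a classification head) can be sketched roughly as below. This is a minimal NumPy illustration, not the authors' code: the layer sizes, the two-class setup, and the random array standing in for BERT's final-layer hidden states are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def lstm_forward(x, Wx, Wh, b):
    """Run a single-layer LSTM over x of shape (seq_len, d_in) and
    return the final hidden state. Gates are stacked in the order
    input, forget, cell, output."""
    d = Wh.shape[0]
    h = np.zeros(d)
    c = np.zeros(d)
    for t in range(x.shape[0]):
        z = x[t] @ Wx + h @ Wh + b            # (4*d,)
        i, f, g, o = np.split(z, 4)
        i = 1.0 / (1.0 + np.exp(-i))          # input gate
        f = 1.0 / (1.0 + np.exp(-f))          # forget gate
        o = 1.0 / (1.0 + np.exp(-o))          # output gate
        g = np.tanh(g)                        # candidate cell state
        c = f * c + i * g
        h = o * np.tanh(c)
    return h

# Assumed sizes: BERT-base hidden width 768; a hypothetical 256-unit
# LSTM; 2 classes (e.g. idiomatic vs. literal); a 16-token sentence.
d_model, d_lstm, n_classes, seq_len = 768, 256, 2, 16

# Stand-in for BERT's final-layer token representations.
hidden_states = rng.normal(size=(seq_len, d_model))

Wx = rng.normal(scale=0.02, size=(d_model, 4 * d_lstm))
Wh = rng.normal(scale=0.02, size=(d_lstm, 4 * d_lstm))
b = np.zeros(4 * d_lstm)
W_cls = rng.normal(scale=0.02, size=(d_lstm, n_classes))

h_last = lstm_forward(hidden_states, Wx, Wh, b)  # sequence summary
logits = h_last @ W_cls                          # shape (n_classes,)
```

In this reading, the LSTM re-encodes the already contextualized BERT states sequentially, so the classifier sees a summary of the whole token sequence rather than only the `[CLS]` vector; in practice one would use a trained `BertModel` and `nn.LSTM` rather than random weights.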
