Exploring Transformers in Emotion Recognition: a comparison of BERT, DistillBERT, RoBERTa, XLNet and ELECTRA
2021-04-05
Diogo Cortiz
Abstract
This paper investigates how Natural Language Understanding (NLU) can be applied to Emotion Recognition, a specific task in affective computing. We fine-tuned different transformer language models (BERT, DistilBERT, RoBERTa, XLNet, and ELECTRA) on a fine-grained emotion dataset and evaluated them in terms of performance (F1-score) and time to complete training.
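To make the evaluation metric concrete, the sketch below computes a macro-averaged F1-score in plain Python. The label values and predictions are invented for illustration and are not taken from the paper; macro averaging is shown because it weights each emotion class equally, which matters for imbalanced fine-grained emotion datasets.

```python
def macro_f1(y_true, y_pred):
    """Macro-averaged F1: compute per-class F1 and average with equal weight."""
    classes = sorted(set(y_true) | set(y_pred))
    scores = []
    for c in classes:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
        scores.append(f1)
    return sum(scores) / len(scores)

# Toy labels for 3 hypothetical emotion classes (e.g. joy=0, sadness=1, anger=2);
# these values are illustrative only, not results from the paper.
y_true = [0, 1, 2, 1]
y_pred = [0, 1, 1, 1]
print(round(macro_f1(y_true, y_pred), 3))  # → 0.6
```

In practice, the same metric is available as `sklearn.metrics.f1_score(y_true, y_pred, average="macro")`.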