Is Encoder-Decoder Transformer the Shiny Hammer?

2022-10-01 · VarDial (COLING) 2022

Nat Gillin


Abstract

We present an approach to multi-class classification using an encoder-decoder transformer model. We trained a network to identify French varieties using the same scripts we use to train an encoder-decoder machine translation model. With slight modifications to the data preparation and inference parameters, we showed that the same tools used for machine translation can easily be re-used to achieve competitive performance on classification. On the French Dialect Identification (FDI) task, we scored 32.4 weighted F1, but this falls well short of a simple Naive Bayes classifier, which outperforms the neural encoder-decoder model at 41.27 weighted F1.
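The data-preparation trick the abstract describes, recasting classification as "translation" where the target side is just the label token, might be sketched as follows. This is a minimal illustration, not the authors' code; the function names and the beam-restriction fallback are assumptions for the sake of the example:

```python
def to_translation_pairs(examples):
    """Recast (text, label) classification examples as parallel
    source/target pairs: the target side is simply the label token,
    so an unmodified MT training script can consume them."""
    sources, targets = [], []
    for text, label in examples:
        sources.append(text.strip())
        targets.append(label)  # a one-token "translation"
    return sources, targets


def predict_label(generate_fn, text, label_set):
    """At inference the MT decoder generates free text; constraining
    the output to the known label set (a slight inference-time
    modification, as the abstract suggests) keeps predictions valid.
    `generate_fn` stands in for any trained seq2seq decode call."""
    hyp = generate_fn(text)
    # fall back to a deterministic default if the decoder
    # emits something outside the label vocabulary
    return hyp if hyp in label_set else next(iter(sorted(label_set)))
```

With data in this shape, the same training and decoding scripts used for translation can be pointed at the classification corpus unchanged.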
