Finnish Language Modeling with Deep Transformer Models
2020-03-14
Abhilash Jain, Aku Ruohe, Stig-Arne Grönroos, Mikko Kurimo
Abstract
Transformers have recently taken center stage in language modeling, after LSTMs were long considered the dominant model architecture. In this project, we investigate the performance of two Transformer architectures, BERT and Transformer-XL, on the language modeling task. We use a sub-word model setting for the Finnish language and compare against the previous state-of-the-art (SOTA) LSTM model. BERT achieves a pseudo-perplexity score of 14.5, which, as far as we know, is the first such measure reported for this task. Transformer-XL improves the perplexity score to 73.58, which is 27% better than the LSTM model.
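Because BERT is a masked language model rather than a left-to-right one, it is scored with pseudo-perplexity: each token is masked in turn, the model's probability of the true token is recorded, and the mean negative log-likelihood is exponentiated. The sketch below illustrates this metric in general terms; it is not the authors' evaluation code, and the multilingual checkpoint name is an assumption standing in for the Finnish sub-word BERT trained in the paper.

```python
# Minimal pseudo-perplexity sketch for a masked LM (illustrative only).
# The checkpoint below is a placeholder; the paper uses its own Finnish
# sub-word BERT, which is not publicly referenced here.
import math
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

MODEL_NAME = "bert-base-multilingual-cased"  # assumption, not the paper's model

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForMaskedLM.from_pretrained(MODEL_NAME)
model.eval()

def pseudo_perplexity(sentence: str) -> float:
    """Mask each token in turn, score the true token under the MLM,
    and exponentiate the mean negative log-likelihood."""
    input_ids = tokenizer(sentence, return_tensors="pt")["input_ids"][0]
    nll, count = 0.0, 0
    # Skip the special [CLS] and [SEP] positions at the ends.
    for i in range(1, input_ids.size(0) - 1):
        masked = input_ids.clone()
        masked[i] = tokenizer.mask_token_id
        with torch.no_grad():
            logits = model(masked.unsqueeze(0)).logits[0, i]
        log_probs = torch.log_softmax(logits, dim=-1)
        nll -= log_probs[input_ids[i]].item()
        count += 1
    return math.exp(nll / count)

print(pseudo_perplexity("Tämä on esimerkkilause."))
```

Note that pseudo-perplexity and the ordinary perplexity reported for Transformer-XL are not directly comparable: the former conditions on bidirectional context at every position, the latter only on the left context.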