Is Domain Adaptation Worth Your Investment? Comparing BERT and FinBERT on Financial Tasks
2021-11-01 · EMNLP (ECONLP) 2021
Bo Peng, Emmanuele Chersoni, Yu-Yin Hsu, Chu-Ren Huang
Abstract
With the recent rise in popularity of Transformer models in Natural Language Processing, research efforts have been dedicated to the development of domain-adapted versions of BERT-like architectures. In this study, we focus on FinBERT, a Transformer model trained on text from the financial domain. By comparing its performance with that of the original BERT on a wide variety of financial text processing tasks, we found continual pretraining from the original model to be the more beneficial option. Domain-specific pretraining from scratch, conversely, appears to be less effective.
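For readers who want to run a comparison of this kind themselves, the sketch below shows how a general-domain BERT and a publicly released FinBERT checkpoint can be dropped into the same fine-tuning pipeline via the Hugging Face transformers library. The model IDs and the 3-label classification setup are illustrative assumptions, not necessarily the checkpoints or tasks evaluated in the paper.

```python
# Minimal sketch (not the authors' exact setup): load a general-domain BERT
# and a domain-adapted FinBERT checkpoint into identical classification heads,
# so the only variable is the pretraining corpus.
# Model IDs below are assumed examples of publicly available checkpoints.
from transformers import AutoModelForSequenceClassification, AutoTokenizer

for model_id in ("bert-base-uncased",   # general-domain baseline
                 "ProsusAI/finbert"):   # one public FinBERT variant (assumed)
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSequenceClassification.from_pretrained(
        model_id, num_labels=3)         # e.g., 3-way financial sentiment
    inputs = tokenizer("Quarterly revenue beat analyst expectations.",
                       return_tensors="pt")
    logits = model(**inputs).logits     # fine-tune on task data before use
    print(model_id, logits.shape)
```

With both models behind the same interface, a standard fine-tuning loop (or the transformers `Trainer`) can then be applied unchanged to each checkpoint, which is what makes this kind of head-to-head domain-adaptation comparison straightforward.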