Multilingual Pre-training with Language and Task Adaptation for Multilingual Text Style Transfer
2022-03-16 · ACL 2022 · Code Available
Huiyuan Lai, Antonio Toral, Malvina Nissim
- github.com/laihuiyuan/multilingual-tst (official, in paper, PyTorch, ★ 9)
Abstract
We exploit the pre-trained seq2seq model mBART for multilingual text style transfer. Using machine-translated data as well as gold aligned English sentences yields state-of-the-art results in the three target languages we consider. Moreover, in view of the general scarcity of parallel data, we propose a modular approach for multilingual formality transfer, which consists of two training strategies that target adaptation to both language and task. Our approach achieves competitive performance without monolingual task-specific parallel data and can be applied to other style transfer tasks as well as to other languages.