Multitask Models for Controlling the Complexity of Neural Machine Translation
2020-07-01 · WS 2020
Sweta Agrawal, Marine Carpuat
Abstract
We introduce a machine translation task where the output is aimed at audiences of different levels of target language proficiency. We collect a novel dataset of news articles available in English and Spanish and written for diverse reading grade levels. We leverage this dataset to train multitask sequence-to-sequence models that translate Spanish into English targeted at an easier reading grade level than the original Spanish. We show that multitask models outperform pipeline approaches that translate and simplify text independently.
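Conditioning a sequence-to-sequence model on a desired output attribute is often done by tagging the source with a control token. The sketch below shows one hypothetical way such multitask training data could be prepared, pairing the same Spanish source with English references at different reading grade levels; the token format and helper are illustrative assumptions, not the paper's exact scheme.

```python
def make_example(src: str, tgt: str, tgt_grade: int) -> tuple[str, str]:
    """Prepend a target reading-grade tag to the source sentence.

    Tagging the source with the desired output grade level is a common
    way to condition a seq2seq model on an output attribute
    (hypothetical format; the paper's exact scheme may differ).
    """
    return (f"<grade_{tgt_grade}> {src}", tgt)


# Toy multitask training set: one Spanish source paired with English
# references written at different reading grade levels.
pairs = [
    ("El gobierno anunció nuevas medidas económicas.",
     "The government announced new economic measures.", 9),
    ("El gobierno anunció nuevas medidas económicas.",
     "The government said it has new money plans.", 4),
]

train_set = [make_example(src, tgt, grade) for src, tgt, grade in pairs]
```

At inference time, prepending a tag for a lower grade than the source would ask the model to translate and simplify in a single step, which is the behavior the multitask setup is meant to learn.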