Multi-Task Neural Models for Translating Between Styles Within and Across Languages

2018-06-12 · COLING 2018 · Code Available

Xing Niu, Sudha Rao, Marine Carpuat


Abstract

Generating natural language requires conveying content in an appropriate style. We explore two related tasks on generating text of varying formality: monolingual formality transfer and formality-sensitive machine translation. We propose to solve these tasks jointly using multi-task learning, and show that our models achieve state-of-the-art performance for formality transfer and are able to perform formality-sensitive translation without being explicitly trained on style-annotated translation examples.
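The abstract describes casting both tasks as sequence-to-sequence problems solved jointly by one shared model. A minimal sketch of what that data setup might look like, assuming a side-constraint style tag prepended to the source to request a target formality (the `<2formal>` tag scheme and the `tag_example`/`interleave` helpers are illustrative assumptions, not the paper's exact format):

```python
import itertools

def tag_example(src, tgt, formality):
    """Prepend a target-formality tag so one shared model serves both tasks.

    The tag format is a hypothetical side-constraint convention.
    """
    return (f"<2{formality}> {src}", tgt)

# Task 1 — monolingual formality transfer: informal English -> formal English.
transfer_data = [
    tag_example("gotta go, ttyl", "I have to go. Talk to you later.", "formal"),
]

# Task 2 — machine translation: French -> English, at a requested formality.
mt_data = [
    tag_example("je dois partir", "I have to go.", "formal"),
]

def interleave(*streams):
    """Alternate batches from each task; all model parameters are shared."""
    for example in itertools.chain.from_iterable(zip(*streams)):
        yield example

mixed = list(interleave(transfer_data, mt_data))
```

Because every example is reduced to the same tagged source/target format, the model can learn formality-sensitive translation without ever seeing style-annotated translation pairs directly, as the abstract claims.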
