Automatic Grammatical Error Correction for Sequence-to-sequence Text Generation: An Empirical Study

2019-07-01 · ACL 2019

Tao Ge, Xingxing Zhang, Furu Wei, Ming Zhou

Abstract

Sequence-to-sequence (seq2seq) models have achieved tremendous success in text generation tasks. However, there is no guarantee that they can always generate sentences without grammatical errors. In this paper, we present a preliminary empirical study on whether, and how much, automatic grammatical error correction can help improve seq2seq text generation. We conduct experiments across various seq2seq text generation tasks, including machine translation, formality style transfer, sentence compression, and simplification. Experiments show that a state-of-the-art grammatical error correction system can improve the grammaticality of generated text and can bring task-oriented improvements in tasks where the target sentences are in a formal style.
