Stronger Baselines for Grammatical Error Correction Using a Pretrained Encoder-Decoder Model

2020-12-01 · Asian Chapter of the Association for Computational Linguistics

Satoru Katsumata, Mamoru Komachi


Abstract

Studies on grammatical error correction (GEC) have reported the effectiveness of pretraining a Seq2Seq model with a large amount of pseudodata. However, this approach requires time-consuming pretraining because of the size of the pseudodata. In this study, we explore the utility of bidirectional and auto-regressive transformers (BART) as a generic pretrained encoder-decoder model for GEC. By using this generic pretrained model, the time-consuming task-specific pretraining can be eliminated. We find that monolingual and multilingual BART models achieve high performance in GEC, with one result being comparable to the current strong results in English GEC.
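The approach the abstract describes amounts to treating GEC as ordinary sequence-to-sequence fine-tuning: an erroneous sentence is the source and its corrected form is the target, with a pretrained BART supplying the encoder-decoder weights. A minimal sketch of that training interface is below, using the Hugging Face Transformers `BartForConditionalGeneration` class. To keep it self-contained, it builds a tiny randomly initialized BART from a `BartConfig` with dummy token ids rather than downloading a pretrained checkpoint; the paper's actual setup fine-tunes the released monolingual and multilingual BART checkpoints, and the specific hyperparameters here are illustrative assumptions, not the authors'.

```python
import torch
from transformers import BartConfig, BartForConditionalGeneration

# Tiny randomly initialized BART standing in for a pretrained checkpoint
# (e.g. facebook/bart-large); sizes are arbitrary for illustration.
config = BartConfig(
    vocab_size=64,
    d_model=32,
    encoder_layers=1,
    decoder_layers=1,
    encoder_attention_heads=2,
    decoder_attention_heads=2,
    encoder_ffn_dim=64,
    decoder_ffn_dim=64,
    max_position_embeddings=32,
)
model = BartForConditionalGeneration(config)

# One GEC training pair: the source is the erroneous sentence, the target
# is its corrected form. Token ids here are dummies; in practice they come
# from the BART tokenizer.
src = torch.tensor([[0, 10, 11, 12, 13, 2]])  # erroneous sentence
tgt = torch.tensor([[0, 10, 14, 12, 13, 2]])  # corrected sentence

# Passing labels makes the model compute the cross-entropy loss over the
# target tokens (decoder inputs are derived by shifting the labels right).
out = model(input_ids=src, labels=tgt)
out.loss.backward()  # one fine-tuning gradient step would follow
print(out.loss.item())
```

In a real run, `BartForConditionalGeneration.from_pretrained(...)` would replace the random initialization, and corrections are produced at inference time with `model.generate(...)` over the erroneous input.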
