SOTAVerified

Chinese Grammatical Correction Using BERT-based Pre-trained Model

2020-11-04 · Asian Chapter of the Association for Computational Linguistics

Hongfei Wang, Michiki Kurosawa, Satoru Katsumata, Mamoru Komachi


Abstract

In recent years, pre-trained models have been extensively studied, and several downstream tasks have benefited from their utilization. In this study, we verify the effectiveness of two methods that incorporate a BERT-based pre-trained model developed by Cui et al. (2020) into an encoder-decoder model on Chinese grammatical error correction tasks. We also analyze the error types and conclude that sentence-level errors are yet to be addressed.
