
TMU Transformer System Using BERT for Re-ranking at BEA 2019 Grammatical Error Correction on Restricted Track

2019-08-01 · WS 2019

Masahiro Kaneko, Kengo Hotate, Satoru Katsumata, Mamoru Komachi


Abstract

We introduce our system submitted to the restricted track of the BEA 2019 shared task on grammatical error correction (GEC). It is essential to select an appropriate hypothesis sentence from the candidate list generated by a GEC model. A re-ranker can evaluate the naturalness of a corrected sentence using language models trained on large corpora. However, these language models and language representations do not explicitly take into account the grammatical errors made by learners. Thus, it is not straightforward to utilize language representations trained on a large corpus, such as Bidirectional Encoder Representations from Transformers (BERT), in a form suited to learners' grammatical errors. Therefore, we propose to fine-tune BERT on learner corpora with grammatical errors for re-ranking. Experimental results on the W&I+LOCNESS development dataset demonstrate that re-ranking using BERT can effectively improve correction performance.
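To make the re-ranking idea concrete, here is a minimal sketch (not the paper's exact implementation): a BERT sequence classifier, assumed to have been fine-tuned on learner corpora to judge grammaticality, scores each hypothesis in the GEC model's n-best list, and that score is interpolated with the GEC model's own score. The checkpoint name, the interpolation weight `alpha`, and the two-label grammaticality framing are all illustrative assumptions.

```python
# Hypothetical sketch of BERT-based n-best re-ranking for GEC.
# Checkpoint, weighting, and the grammaticality-classifier framing
# are assumptions for illustration, not the paper's configuration.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-cased")
# In the paper's setting this model would be fine-tuned on learner
# corpora; here the base checkpoint stands in as a placeholder, with
# label 1 assumed to mean "grammatical".
model = BertForSequenceClassification.from_pretrained("bert-base-cased")
model.eval()

def bert_score(sentence: str) -> float:
    """Log-probability that BERT labels the sentence grammatical."""
    inputs = tokenizer(sentence, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    return torch.log_softmax(logits, dim=-1)[0, 1].item()

def rerank(hypotheses, gec_scores, alpha=0.5):
    """Return the hypothesis maximizing an interpolation of the GEC
    model score and the BERT score (alpha is an assumed weight)."""
    combined = [
        alpha * g + (1 - alpha) * bert_score(h)
        for h, g in zip(hypotheses, gec_scores)
    ]
    return hypotheses[max(range(len(combined)), key=combined.__getitem__)]

# Example: an n-best list from a GEC model with its log-probabilities.
nbest = ["He go to school every day.", "He goes to school every day."]
print(rerank(nbest, gec_scores=[-1.2, -1.5]))
```

With a classifier actually fine-tuned on learner data, the BERT term would favor the grammatical second hypothesis even when the GEC model scores the first one higher.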
