Cardiff University at SemEval-2020 Task 6: Fine-tuning BERT for Domain-Specific Definition Classification

2020-12-01 · SemEval-2020

Shelan Jeawak, Luis Espinosa-Anke, Steven Schockaert

Abstract

We describe the system submitted to SemEval-2020 Task 6, Subtask 1. The aim of this subtask is to predict whether a given sentence contains a definition. Unsurprisingly, we found that strong results can be achieved by fine-tuning a pre-trained BERT language model. In this paper, we analyze the performance of this strategy. Among other findings, we show that results can be improved by using a two-step fine-tuning process, in which the BERT model is first fine-tuned on the full training set and then further specialized towards a target domain.
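The two-step fine-tuning process can be sketched as below. This is a minimal illustration, not the authors' implementation: a tiny linear classifier over random "sentence embeddings" stands in for the pre-trained BERT encoder, and both datasets are synthetic. Only the structure of the procedure — fine-tune on the full training set first, then continue fine-tuning on the target-domain subset — reflects the paper.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Stand-in for a pre-trained BERT encoder plus classification head
# (assumption: in the real system this is a fine-tuned BERT model;
# here a linear probe illustrates only the two-step training loop).
model = nn.Linear(16, 2)

def fine_tune(model, X, y, epochs=5, lr=1e-2):
    """One fine-tuning phase: standard cross-entropy training."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    loss = None
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(X), y)
        loss.backward()
        opt.step()
    return loss.item()

# Step 1: fine-tune on the full (multi-domain) training set.
X_full = torch.randn(64, 16)
y_full = torch.randint(0, 2, (64,))
loss_full = fine_tune(model, X_full, y_full)

# Step 2: continue fine-tuning the same model on the
# target-domain subset only, specializing it to that domain.
X_domain = torch.randn(16, 16)
y_domain = torch.randint(0, 2, (16,))
loss_domain = fine_tune(model, X_domain, y_domain)
```

The key design point is that the second phase starts from the weights produced by the first, so the domain-specific pass adapts an already-trained definition classifier rather than training from scratch.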