
System description for ProfNER - SMMH: Optimized finetuning of a pretrained transformer and word vectors

2021-06-01 · NAACL (SMM4H) 2021

David Carreto Fidalgo, Daniel Vila-Suero, Francisco Aranda Montes, Ignacio Talavera Cepeda



Abstract

This shared-task system description presents two neural network architectures submitted to the ProfNER track, among them the winning system, which scored highest in both sub-tasks 7a and 7b. We describe in detail the approach, the preprocessing steps, and the architectures used to achieve the submitted results, and also provide a GitHub repository for reproducing the scores. The winning system is based on a pretrained transformer language model and solves the two sub-tasks simultaneously.
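The joint setup mentioned in the abstract — one shared encoder feeding both a sequence-level head for sub-task 7a (tweet classification) and a token-level head for sub-task 7b (NER) — can be sketched roughly as follows. All dimensions, head names, and the pooling choice are illustrative assumptions for a minimal NumPy sketch, not the authors' actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not the paper's values).
seq_len, hidden = 8, 16
n_tweet_classes = 2   # sub-task 7a: binary tweet classification
n_token_labels = 5    # sub-task 7b: NER tags (e.g. BIO over entity types)

# Stand-in for the contextual token representations a pretrained
# transformer would produce for one tweet: shape (seq_len, hidden).
token_reprs = rng.standard_normal((seq_len, hidden))

# Two task-specific linear heads on top of the SAME shared encoder output,
# so a single forward pass serves both sub-tasks.
W_cls = rng.standard_normal((hidden, n_tweet_classes))
W_ner = rng.standard_normal((hidden, n_token_labels))

# Sub-task 7a: pool over tokens, then classify the whole tweet.
tweet_logits = token_reprs.mean(axis=0) @ W_cls   # shape (n_tweet_classes,)

# Sub-task 7b: classify every token independently.
token_logits = token_reprs @ W_ner                # shape (seq_len, n_token_labels)
```

During training, the losses of the two heads would simply be summed so that gradients from both sub-tasks update the shared encoder.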
