
Accurate Dependency Parsing and Tagging of Latin

2022-06-01 · LT4HALA (LREC) 2022

Sebastian Nehrdich, Oliver Hellwig


Abstract

Having access to high-quality grammatical annotations is important for downstream tasks in NLP as well as for corpus-based research. In this paper, we describe experiments with the Latin BERT word embeddings that were recently made available by Bamman and Burns (2020). We show that these embeddings produce competitive results in the low-level task of morpho-syntactic tagging. In addition, we describe a graph-based dependency parser that is trained with these embeddings and that clearly outperforms various baselines.
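To illustrate the graph-based parsing approach the abstract refers to, the sketch below shows the core idea under simplified assumptions: contextual token embeddings (random vectors here, standing in for Latin BERT output) are scored pairwise with a bilinear arc scorer, and each token then selects a head. The embeddings, the weight matrix `W`, and the greedy per-token decoding are illustrative placeholders, not the paper's actual model; a full graph-based parser would decode a maximum spanning tree (e.g. Chu-Liu/Edmonds) instead of taking per-token argmaxes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for contextual embeddings from Latin BERT:
# one vector per token, plus an artificial ROOT node at index 0.
tokens = ["ROOT", "Gallia", "est", "omnis", "divisa"]
dim = 8
emb = rng.normal(size=(len(tokens), dim))

# Bilinear arc scorer: score(head h, dependent d) = emb[h]^T W emb[d].
# W is an untrained random matrix here, purely for illustration.
W = rng.normal(size=(dim, dim))
scores = emb @ W @ emb.T           # scores[h, d] = arc score h -> d
np.fill_diagonal(scores, -np.inf)  # forbid self-loops
scores[:, 0] = -np.inf             # ROOT never takes a head

# Greedy decoding: each non-root token picks its highest-scoring head.
# (A real graph-based parser would run MST decoding here instead.)
heads = scores[:, 1:].argmax(axis=0)  # head index for tokens 1..n
for d, h in enumerate(heads, start=1):
    print(f"{tokens[d]} <- {tokens[h]}")
```

With trained embeddings and a learned `W`, the same score matrix feeds directly into MST decoding, which is what makes the approach "graph-based" rather than transition-based.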
