SOTAVerified

Multiple Sources are Better Than One: Incorporating External Knowledge in Low-Resource Glossing

2024-06-16 · Code Available

Changbing Yang, Garrett Nicolai, Miikka Silfverberg


Abstract

In this paper, we address the data scarcity problem in automatic data-driven glossing for low-resource languages by coordinating multiple sources of linguistic expertise. We supplement models with translations at both the token and sentence level, and we leverage the extensive linguistic capabilities of modern LLMs. Our enhancements lead to an average absolute improvement of 5 percentage points in word-level accuracy over the previous state of the art on a typologically diverse dataset spanning six low-resource languages. The improvements are particularly notable for the lowest-resourced language, Gitksan, where we achieve a 10-percentage-point improvement. Furthermore, in a simulated ultra-low-resource setting for the same six languages, training on fewer than 100 glossed sentences, we establish an average 10-percentage-point improvement in word-level accuracy over the previous state-of-the-art system.
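The metric reported above, word-level accuracy, scores a sentence gloss by counting how many per-word gloss strings exactly match the gold annotation. A minimal sketch of this computation (the function name and toy gloss data are illustrative, not the authors' code or data):

```python
def word_accuracy(predicted, gold):
    """Fraction of words whose predicted gloss exactly matches the gold gloss.

    Each argument is a list of sentences; each sentence is a list of
    per-word gloss strings (e.g. "dog-PL" or "see-PST").
    """
    correct = total = 0
    for pred_sent, gold_sent in zip(predicted, gold):
        for pred_word, gold_word in zip(pred_sent, gold_sent):
            correct += pred_word == gold_word
            total += 1
    return correct / total if total else 0.0

# Toy interlinear-style glosses (illustrative data):
gold = [["dog-PL", "run-PST"], ["1SG", "see-PST", "house-SG"]]
pred = [["dog-PL", "run-PRS"], ["1SG", "see-PST", "house-SG"]]
print(word_accuracy(pred, gold))  # 4 of 5 words match -> 0.8
```

An exact string match is a strict criterion: a gloss that differs in a single morpheme tag (here, PRS vs. PST) counts as wrong for the whole word.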
