
IT–IST at the SIGMORPHON 2019 Shared Task: Sparse Two-headed Models for Inflection

2019-08-01 · WS 2019

Ben Peters, André F. T. Martins

Abstract

This paper presents the Instituto de Telecomunicações–Instituto Superior Técnico submission to Task 1 of the SIGMORPHON 2019 Shared Task. Our models combine sparse sequence-to-sequence models with a two-headed attention mechanism that learns separate attention distributions for the lemma and inflectional tags. Among submissions to Task 1, our models rank second and third. Despite the low data setting of the task (only 100 in-language training examples), they learn plausible inflection patterns and often concentrate all probability mass into a small set of hypotheses, making beam search exact.
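The abstract's two key ingredients can be illustrated with a small sketch: sparsemax attention, which unlike softmax can assign exactly zero probability to some positions (this is what lets probability mass concentrate on few hypotheses), and a decoder step that keeps two separate attention distributions, one over the lemma's character encodings and one over the inflectional-tag encodings. The NumPy sketch below is illustrative only: the dimensions, dot-product scoring, and names such as two_headed_attention are assumptions, not the authors' implementation, which embeds these pieces inside a full sequence-to-sequence model.

import numpy as np

def sparsemax(z):
    # Project a score vector onto the probability simplex (sparsemax).
    # Low-scoring entries can receive exactly zero probability.
    z = np.asarray(z, dtype=float)
    z_sorted = np.sort(z)[::-1]
    cumsum = np.cumsum(z_sorted)
    k = np.arange(1, z.size + 1)
    support = 1 + k * z_sorted > cumsum      # prefix of entries kept in the support
    k_max = k[support][-1]
    tau = (cumsum[k_max - 1] - 1.0) / k_max  # threshold subtracted from all scores
    return np.maximum(z - tau, 0.0)

def two_headed_attention(query, lemma_states, tag_states):
    # Separate sparsemax attention over lemma and tag encodings (assumed
    # dot-product scoring); returns the two context vectors and distributions.
    lemma_probs = sparsemax(lemma_states @ query)  # distribution over lemma characters
    tag_probs = sparsemax(tag_states @ query)      # distribution over inflectional tags
    return lemma_probs @ lemma_states, tag_probs @ tag_states, lemma_probs, tag_probs

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d = 8
    lemma_states = rng.normal(size=(5, d))  # e.g. encodings of a 5-character lemma
    tag_states = rng.normal(size=(3, d))    # e.g. encodings of 3 inflection tags
    query = rng.normal(size=d)              # decoder hidden state at one time step
    _, _, lemma_probs, tag_probs = two_headed_attention(query, lemma_states, tag_states)
    print("lemma attention:", np.round(lemma_probs, 3))  # typically contains exact zeros
    print("tag attention:  ", np.round(tag_probs, 3))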
