Extracting General-use Transformers for Low-resource Languages via Knowledge Distillation

2025-01-22

Jan Christian Blaise Cruz, Alham Fikri Aji

Abstract

In this paper, we propose using simple knowledge distillation to produce smaller, more efficient single-language transformers from Massively Multilingual Transformers (MMTs), alleviating the tradeoffs that MMTs incur in low-resource settings. Using Tagalog as a case study, we show that these smaller single-language models perform on par with strong baselines across a variety of benchmark tasks while being far more efficient. Furthermore, we investigate additional steps during the distillation process that improve soft supervision of the target language, and we provide a number of analyses and ablations to demonstrate the efficacy of the proposed method.
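The "soft supervision" the abstract refers to is the teacher MMT's output distribution guiding the student. As a minimal sketch only, not the paper's exact recipe, the standard temperature-scaled distillation objective (Hinton et al., 2015) can be written in PyTorch as below; the function name, tensor shapes, and temperature are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Soft-label distillation: KL divergence between the temperature-
    softened teacher and student output distributions."""
    student_log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    teacher_probs = F.softmax(teacher_logits / temperature, dim=-1)
    # reduction="batchmean" divides by the batch size; scaling by T^2
    # keeps gradient magnitudes comparable across temperatures.
    kl = F.kl_div(student_log_probs, teacher_probs, reduction="batchmean")
    return kl * temperature ** 2

# Illustrative usage: per-token vocabulary logits from a small
# single-language student and a frozen multilingual teacher.
student_logits = torch.randn(8, 128, 30000, requires_grad=True)
with torch.no_grad():
    teacher_logits = torch.randn(8, 128, 30000)  # stand-in for MMT outputs
loss = distillation_loss(student_logits, teacher_logits)
loss.backward()
```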
