Numeracy for Language Models: Evaluating and Improving their Ability to Predict Numbers

2018-05-21 · ACL 2018 · Code Available

Georgios P. Spithourakis, Sebastian Riedel

Abstract

Numeracy is the ability to understand and work with numbers. It is a necessary skill for composing and understanding documents in clinical, scientific, and other technical domains. In this paper, we explore different strategies for modelling numerals with language models, such as memorisation and digit-by-digit composition, and propose a novel neural architecture that uses a continuous probability density function to model numerals from an open vocabulary. Our evaluation on clinical and scientific datasets shows that using hierarchical models to distinguish numerals from words improves a perplexity metric on the subset of numerals by 2 and 4 orders of magnitude, respectively, over non-hierarchical models. A combination of strategies can further improve perplexity. Our continuous probability density function model reduces mean absolute percentage errors by 18% and 54% in comparison to the second best strategy for each dataset, respectively.
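
The abstract describes a hierarchical output layer that first decides whether the next token is a word or a numeral, and models numerals with a continuous probability density (a mixture of Gaussians over numeral values). The snippet below is a minimal PyTorch sketch of that idea, not the authors' implementation: the class name HierarchicalNumeralHead, the number of mixture components, and the toy inputs are illustrative assumptions.

```python
import torch
import torch.nn as nn


class HierarchicalNumeralHead(nn.Module):
    """Output head that first gates word-vs-numeral, then scores each branch."""

    def __init__(self, hidden_size: int, vocab_size: int, n_components: int = 5):
        super().__init__()
        self.word_logits = nn.Linear(hidden_size, vocab_size)           # softmax over word vocabulary
        self.class_logits = nn.Linear(hidden_size, 2)                   # gate: P(word) vs P(numeral)
        self.mixture_params = nn.Linear(hidden_size, 3 * n_components)  # weights, means, log-scales
        self.n_components = n_components

    def numeral_log_density(self, h, value):
        # Mixture-of-Gaussians log density over the numeral's value (open vocabulary).
        logits, means, log_scales = self.mixture_params(h).chunk(3, dim=-1)
        log_weights = torch.log_softmax(logits, dim=-1)
        components = torch.distributions.Normal(means, log_scales.exp())
        per_component = components.log_prob(value.unsqueeze(-1))        # broadcast value over components
        return torch.logsumexp(log_weights + per_component, dim=-1)

    def forward(self, h, target_word=None, target_value=None):
        gate = torch.log_softmax(self.class_logits(h), dim=-1)
        if target_value is not None:                                    # next token is a numeral
            return gate[..., 1] + self.numeral_log_density(h, target_value)
        word_lp = torch.log_softmax(self.word_logits(h), dim=-1)        # next token is an ordinary word
        return gate[..., 0] + word_lp.gather(-1, target_word.unsqueeze(-1)).squeeze(-1)


# Toy usage: hidden states from any language model (RNN, Transformer) would go here.
head = HierarchicalNumeralHead(hidden_size=64, vocab_size=1000)
h = torch.randn(8, 64)                                                  # batch of 8 hidden states
log_p_numeral = head(h, target_value=torch.full((8,), 98.6))            # e.g. a clinical temperature reading
log_p_word = head(h, target_word=torch.randint(0, 1000, (8,)))
```

The gate is what makes the model hierarchical: numerals never compete directly with words in one softmax, which is what drives the large perplexity improvements on the numeral subset reported in the abstract.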
