Solving Arithmetic Word Problems Using Transformer and Pre-processing of Problem Texts
2020-12-01 · ICON 2020
Kaden Griffith, Jugal Kalita
Abstract
This paper outlines the use of Transformer networks trained to translate math word problems to equivalent arithmetic expressions in infix, prefix, and postfix notations. We compare results produced by a large number of neural configurations and find that most configurations outperform previously reported approaches on three of four datasets with significant increases in accuracy of over 20 percentage points. The best neural approaches boost accuracy by 30% on average when compared to the previous state-of-the-art.
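The target notations mentioned in the abstract can be illustrated with a small sketch (not from the paper): a standard shunting-yard conversion from an infix arithmetic expression to postfix, plus a stack-based rebuild into prefix, showing the three equivalent output forms a model could be trained to emit.

```python
# Sketch only: illustrates the infix/prefix/postfix equivalence the
# abstract refers to; the paper's models generate these with a Transformer,
# not with this algorithm.

PREC = {"+": 1, "-": 1, "*": 2, "/": 2}

def infix_to_postfix(tokens):
    """Shunting-yard: infix token list -> postfix token list."""
    out, ops = [], []
    for tok in tokens:
        if tok in PREC:
            # Pop operators of equal or higher precedence first.
            while ops and ops[-1] in PREC and PREC[ops[-1]] >= PREC[tok]:
                out.append(ops.pop())
            ops.append(tok)
        elif tok == "(":
            ops.append(tok)
        elif tok == ")":
            while ops[-1] != "(":
                out.append(ops.pop())
            ops.pop()  # discard "("
        else:  # operand (a number from the word problem)
            out.append(tok)
    while ops:
        out.append(ops.pop())
    return out

def postfix_to_prefix(tokens):
    """Rebuild each subexpression on a stack, operator first."""
    stack = []
    for tok in tokens:
        if tok in PREC:
            right, left = stack.pop(), stack.pop()
            stack.append([tok] + left + right)
        else:
            stack.append([tok])
    return stack[0]

# e.g. "5 plus 3 times 2" as an arithmetic expression:
infix = ["5", "+", "3", "*", "2"]
print(infix_to_postfix(infix))                      # ['5', '3', '2', '*', '+']
print(postfix_to_prefix(infix_to_postfix(infix)))   # ['+', '5', '*', '3', '2']
```

All three forms encode the same expression tree; prefix and postfix avoid parentheses, which is one reason such notations are attractive as sequence-generation targets.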