SOTAVerified

Exact Learning of Arithmetic with Differentiable Agents

2025-11-27 · Code Available

Hristo Papazov, Francesco D'Angelo, Nicolas Flammarion

Abstract

We explore the possibility of exact algorithmic learning with gradient-based methods and introduce a differentiable framework capable of strong length generalization on arithmetic tasks. Our approach centers on Differentiable Finite-State Transducers (DFSTs), a Turing-complete model family that avoids the pitfalls of prior architectures by enabling constant-precision, constant-time generation, and end-to-end log-parallel differentiable training. Leveraging policy-trajectory observations from expert agents, we train DFSTs to perform binary and decimal addition and multiplication. Remarkably, models trained on tiny datasets generalize without error to inputs thousands of times longer than the training examples. These results show that training differentiable agents on structured intermediate supervision could pave the way towards exact gradient-based learning of algorithmic skills. Code available at https://github.com/dngfra/differentiable-exact-algorithmic-learner.git.
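To make the task family concrete, here is a minimal sketch of a classical (non-differentiable) finite-state transducer for binary addition — the kind of constant-precision, constant-time-per-symbol algorithm the paper's DFSTs are trained to learn. All names are illustrative and not taken from the paper's codebase; the transducer's only state is the carry bit.

```python
# Classical finite-state transducer for binary addition (illustrative
# sketch, not the paper's DFST implementation). The transducer reads
# aligned bit pairs least-significant-bit first and emits one output
# bit per step; the carry bit is the entire machine state, so memory
# and per-step compute are constant regardless of input length.

def fst_binary_add(a: str, b: str) -> str:
    """Add two binary strings with a two-state (carry 0/1) transducer."""
    n = max(len(a), len(b))
    a, b = a.zfill(n), b.zfill(n)       # align the two operands
    carry = 0                            # transducer state
    out = []
    for x, y in zip(reversed(a), reversed(b)):  # LSB-first stream
        s = int(x) + int(y) + carry
        out.append(str(s % 2))           # emitted output symbol
        carry = s // 2                   # state transition
    if carry:
        out.append("1")                  # final carry, if any
    return "".join(reversed(out))
```

Because the state space is constant, the same transition rules apply at every position, which is what makes error-free generalization to inputs far longer than the training examples possible in principle.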
