Linear algebra with transformers
2021-12-03
François Charton
Code
- github.com/facebookresearch/lawt — official, in paper, PyTorch, ★ 76
- github.com/ml4sci/symbahep — PyTorch, ★ 0
Abstract
Transformers can learn to perform numerical computations from examples only. I study nine problems of linear algebra, from basic matrix operations to eigenvalue decomposition and inversion, and introduce and discuss four encoding schemes to represent real numbers. On all problems, transformers trained on sets of random matrices achieve high accuracies (over 90%). The models are robust to noise, and can generalize out of their training distribution. In particular, models trained to predict Laplace-distributed eigenvalues generalize to different classes of matrices: Wigner matrices or matrices with positive eigenvalues. The reverse is not true.
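To make the idea of an encoding scheme concrete, here is a minimal sketch of one plausible scheme: a real number rounded to a few significant digits and emitted as a token sequence of sign, mantissa digits, and a power-of-ten exponent. This is an illustrative assumption in the spirit of the base-10 positional encodings the paper discusses, not a reproduction of its exact tokenization; the function name `encode_p10` and the token formats are hypothetical.

```python
import math

# Hypothetical sketch, not the paper's exact scheme: encode a real number
# as [sign, d_1, ..., d_p, E<exponent>] tokens with p significant digits.
def encode_p10(x: float, precision: int = 3) -> list[str]:
    sign = "+" if x >= 0 else "-"
    x = abs(x)
    if x == 0.0:
        return [sign] + ["0"] * precision + ["E0"]
    exp = math.floor(math.log10(x))
    mantissa = round(x / 10 ** exp, precision - 1)
    if mantissa >= 10:  # rounding can carry over, e.g. 9.999 -> 10.0
        mantissa /= 10
        exp += 1
    digits = f"{mantissa:.{precision - 1}f}".replace(".", "")
    return [sign] + list(digits) + [f"E{exp}"]

print(encode_p10(3.14159))   # ['+', '3', '1', '4', 'E0']
print(encode_p10(-0.00271))  # ['-', '2', '7', '1', 'E-3']
```

A matrix would then be serialized by concatenating the token sequences of its entries, with the vocabulary size and sequence length traded off against the numerical precision retained.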