Holophrasm: a neural Automated Theorem Prover for higher-order logic
2016-08-08 · Code Available
Daniel Whalen
- github.com/dwhalen/holophrasm (Official, ★ 0)
- github.com/jinpz/refactor (PyTorch, ★ 8)
- github.com/justin941208/SPIA-Project (★ 0)
- github.com/giomasce/mmpp (★ 0)
Abstract
I propose a system for Automated Theorem Proving in higher-order logic using deep learning and eschewing hand-constructed features. Holophrasm exploits the formalism of the Metamath language and explores partial proof trees using a neural-network-augmented bandit algorithm and a sequence-to-sequence model for action enumeration. The system proves 14% of its test theorems from Metamath's set.mm module.
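The abstract describes exploring partial proof trees with a neural-network-augmented bandit algorithm, with candidate proof steps enumerated by a sequence-to-sequence model. The sketch below shows the general shape of such a UCB-guided tree search. It is a minimal illustration, not Holophrasm's actual implementation: `Node`, `ucb`, `enumerate_actions`, and the placeholder reward are all hypothetical stand-ins (in the real system, action proposals would come from the seq2seq model and payoff estimates from learned networks).

```python
import math

class Node:
    """A node in a partial proof tree: one open goal plus search statistics."""
    def __init__(self, goal, parent=None):
        self.goal = goal
        self.parent = parent
        self.children = []
        self.visits = 0
        self.value = 0.0  # accumulated payoff (a value network would supply this)

def ucb(node, c=1.0):
    # Upper confidence bound: balance the mean payoff (exploitation)
    # against a bonus for rarely visited nodes (exploration).
    if node.visits == 0:
        return float("inf")
    return node.value / node.visits + c * math.sqrt(
        math.log(node.parent.visits) / node.visits
    )

def enumerate_actions(goal):
    # Hypothetical stand-in for the sequence-to-sequence model that proposes
    # candidate proof steps for a goal; here it just forks two dummy subgoals.
    return [goal + "/a", goal + "/b"]

def search(root, iterations=100):
    for _ in range(iterations):
        node = root
        # Selection: descend the tree, always following the highest UCB score.
        while node.children:
            node = max(node.children, key=ucb)
        # Expansion: ask the action model for candidate steps at the leaf.
        node.children = [Node(g, parent=node) for g in enumerate_actions(node.goal)]
        # Evaluation: placeholder payoff (shorter goals score higher here).
        reward = 1.0 / (1.0 + len(node.goal))
        # Backpropagation: update visit counts and values up to the root.
        while node is not None:
            node.visits += 1
            node.value += reward
            node = node.parent

root = Node("|- goal")
search(root, iterations=50)
```

Each iteration performs one selection/expansion/evaluation/backpropagation pass, so after 50 iterations the root has recorded 50 visits; the bandit view treats each node's children as arms whose payoffs estimate provability.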
Benchmark Results
| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| Metamath set.mm | Holophrasm | Test theorems proved (%) | 14.3 | — | Unverified |