Discrete Word Embedding for Logical Natural Language Understanding

2020-08-26

Masataro Asai, Zilu Tang

Abstract

We propose an unsupervised neural model for learning a discrete embedding of words. Unlike existing discrete embeddings, our binary embedding supports vector arithmetic operations similar to continuous embeddings. Our embedding represents each word as a set of propositional statements describing a transition rule in the classical/STRIPS planning formalism. This makes the embedding directly compatible with symbolic, state-of-the-art classical planning solvers.
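To illustrate what vector arithmetic on binary embeddings can look like, here is a minimal sketch using toy, hand-crafted 4-bit codes over hypothetical features. The bitwise operations and the feature assignments below are assumptions for illustration; the paper's actual embedding is learned and its arithmetic may differ.

```python
import numpy as np

# Toy binary embeddings over hypothetical features [royal, male, female, human].
# These codes are illustrative only, not the paper's learned embeddings.
emb = {
    "king":  np.array([1, 1, 0, 1], dtype=bool),
    "man":   np.array([0, 1, 0, 1], dtype=bool),
    "woman": np.array([0, 0, 1, 1], dtype=bool),
    "queen": np.array([1, 0, 1, 1], dtype=bool),
}

def analogy(a, b, c):
    """Binary analogue of the continuous a - b + c:
    clear the bits of b, then set the bits of c."""
    return (emb[a] & ~emb[b]) | emb[c]

def hamming(x, y):
    """Hamming distance between two binary codes."""
    return int(np.count_nonzero(x ^ y))

# "king" - "man" + "woman" resolves to the code nearest to "queen".
result = analogy("king", "man", "woman")
nearest = min(emb, key=lambda w: hamming(emb[w], result))
print(nearest)  # queen
```

With interpretable bits, subtraction becomes bit-clearing and addition becomes bit-setting, so the analogy lands exactly on the target code; a learned embedding would only approximate this.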