Continuous and Interactive Factual Knowledge Learning in Verification Dialogues

2020-10-15 · NeurIPS Workshop HAMLETS 2020

Anonymous

Abstract

Knowledge bases (KBs) used in applications such as dialogue systems need to be continuously expanded in order to serve users well. This process is known as knowledge base completion (KBC). A piece of knowledge, or a fact, is often represented as a triple (s, r, t), meaning that the entity s and the entity t are linked by the relation r. KBC builds a model to infer missing facts from the existing ones in a given KB. Existing KBC research typically makes the closed-world assumption: to infer a new fact (s, r, t), it requires that s, r and t are already in the KB but not yet linked. Clearly, this assumption is a serious limitation. In this paper, we eliminate this assumption and allow s, r and/or t to be unknown to the KB, a setting we call open-world knowledge base completion (OKBC). We focus on solving OKBC via user interactions, which enables the proposed system to potentially serve as an engine for learning new knowledge during dialogue. Experimental results show the effectiveness of the proposed approach.
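The distinction the abstract draws between closed-world KBC and open-world OKBC can be sketched in a few lines. This is a minimal illustration, not code from the paper; the KB contents, function names, and the simple membership checks are all hypothetical:

```python
# Illustrative sketch: a KB as a set of (s, r, t) triples, with the
# candidate-filtering step that differs between closed-world KBC and OKBC.
# All data and names are made up for illustration.

kb = {
    ("Paris", "capital_of", "France"),
    ("Berlin", "capital_of", "Germany"),
}

# Entities and relations already known to the KB.
entities = {e for (s, _, t) in kb for e in (s, t)}
relations = {r for (_, r, _) in kb}

def closed_world_candidate(s, r, t):
    """Classic KBC only considers triples whose s, r and t are
    already in the KB (but not yet linked)."""
    return s in entities and r in relations and t in entities and (s, r, t) not in kb

def open_world_candidate(s, r, t):
    """OKBC drops that restriction: s, r and/or t may be entirely
    new, to be acquired e.g. through user interaction."""
    return (s, r, t) not in kb

# "Madrid" and "Spain" are unknown entities, so closed-world KBC
# cannot even pose this query, while OKBC can.
print(closed_world_candidate("Madrid", "capital_of", "Spain"))  # False
print(open_world_candidate("Madrid", "capital_of", "Spain"))    # True
```

The design point is simply that OKBC moves the burden from filtering out unknown symbols to acquiring them, which is what motivates the interactive, dialogue-based setting the paper proposes.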
