Exploiting Pairwise Mutual Information for Knowledge-Grounded Dialogue

2022-03-22 · IEEE/ACM Transactions on Audio, Speech, and Language Processing, 2022 · Code Available

Bo Zhang, Jian Wang, Hongfei Lin, Hui Ma, Bo Xu


Abstract

External document knowledge is helpful for dialogue systems to generate high-quality responses. Although several knowledge-grounded dialogue models have been designed, external knowledge cannot be comprehensively exploited due to the complex relationships among dialogue context, knowledge, and responses. To this end, we propose a novel transformer-based model, named TransIKG, which incorporates external document knowledge for dialogue generation. TransIKG comprises a two-step integration mechanism, including correlation integration and overall integration. Correlation integration is designed to fully exploit the pairwise mutual information among dialogue context, knowledge, and responses, while overall integration adopts an integration gate to capture global information. Furthermore, we utilize the positional information of dialogue turns to better represent the dialogue context and enhance the generalization ability of our model on out-of-domain documents. Finally, we propose a novel knowledge-aware pointer network to generate knowledge-enhanced response tokens. Experimental results on two benchmark datasets demonstrate that our model outperforms state-of-the-art models on both open-domain and domain-specific dialogues.
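The abstract describes two ideas in general terms: an integration gate that fuses context and knowledge representations, and a pointer network that lets the decoder copy knowledge tokens. As an illustration only (this is not the authors' implementation; all function names, shapes, and parameters below are hypothetical), the two mechanisms can be sketched in NumPy:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def integration_gate(ctx, know, W, b):
    """Fuse a context vector and a knowledge vector with a learned gate.

    g is computed from both inputs; the output interpolates between them:
    g * ctx + (1 - g) * know. W has shape (d, 2d), b has shape (d,).
    """
    g = sigmoid(W @ np.concatenate([ctx, know]) + b)
    return g * ctx + (1.0 - g) * know

def pointer_mixture(vocab_logits, copy_attn, src_ids, p_gen):
    """Mix a generation distribution with a copy distribution.

    vocab_logits: unnormalized scores over the output vocabulary.
    copy_attn: attention weights over source (knowledge) tokens, summing to 1.
    src_ids: vocabulary id of each source token.
    p_gen: probability of generating from the vocabulary rather than copying.
    """
    dist = p_gen * softmax(vocab_logits)
    for attn, tok in zip(copy_attn, src_ids):
        # Route copy probability mass to the source token's vocabulary slot.
        dist[tok] += (1.0 - p_gen) * attn
    return dist
```

The mixture keeps a valid probability distribution as long as `copy_attn` sums to 1, which is what allows knowledge tokens (including rare or out-of-vocabulary words from the grounding document) to receive extra probability mass at decoding time.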
