
Self-supervised clarification question generation for ambiguous multi-turn conversation

2021-12-17 · journal (2021)

Taihua Shao, Fei Cai, Wanyu Chen, Honghui Chen


Abstract

Clarification Question Generation (CQG) aims to automatically generate clarification questions to avoid misunderstanding. In this paper, we focus on generating clarification questions for ambiguous multi-turn conversations, a setting well suited to interactive systems, e.g., dialogue systems and conversational recommender systems. As CQG is a novel direction, only limited manually annotated samples are available. Moreover, existing approaches largely ignore the representation of ambiguous semantics and handle the Out-of-Vocabulary (OOV) problem poorly. To address these issues, we propose a Self-supervised Hierarchical Pointer-generator model (SHiP) for this task. Specifically, mirroring the coarse-to-fine backbone process of CQG, we first formulate two self-supervised pretext tasks, i.e., Dialogue History Prediction and Entity Name Prediction. Then, we incorporate a hierarchical Transformer mechanism and a pointer-generator mechanism to understand ambiguous multi-turn conversations and mitigate the OOV problem. Finally, we propose an end-to-end co-training paradigm to jointly train the pretext tasks and the downstream task. We quantify the improvements of SHiP over competitive baselines on the publicly available CLAQUA dataset, showing improvements of 6.75% and 3.91% over the state-of-the-art baseline in terms of BLEU and ROUGE-L, respectively.
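The abstract's OOV claim rests on the pointer-generator mechanism: the decoder mixes its vocabulary distribution with an attention-based copy distribution over source tokens, so out-of-vocabulary words from the conversation can still be emitted. A minimal sketch of that mixing step is below; all sizes, tensor values, and the function name are toy assumptions for illustration, not details from the SHiP paper.

```python
import numpy as np

def pointer_generator_dist(p_vocab, attention, src_ids, extended_vocab_size, p_gen):
    """Mix the decoder's vocabulary distribution with an attention-based
    copy distribution over source tokens (pointer-generator style).

    p_vocab: (V,) softmax over the fixed vocabulary
    attention: (T,) attention weights over the T source tokens
    src_ids: (T,) ids of the source tokens in the *extended* vocabulary,
             where OOV source words get ids >= V
    p_gen: scalar in [0, 1], probability of generating vs. copying
    """
    p_final = np.zeros(extended_vocab_size)
    p_final[: len(p_vocab)] = p_gen * p_vocab
    # Scatter-add copy probabilities onto the extended vocabulary;
    # np.add.at accumulates correctly even with repeated source ids.
    np.add.at(p_final, src_ids, (1.0 - p_gen) * attention)
    return p_final

# Toy example: vocabulary of 5 words, one OOV source token with extended id 5.
p_vocab = np.array([0.1, 0.2, 0.3, 0.25, 0.15])
attention = np.array([0.6, 0.4])
src_ids = np.array([2, 5])  # second source token is OOV
dist = pointer_generator_dist(p_vocab, attention, src_ids, 6, p_gen=0.7)
print(dist.sum())   # still a valid distribution (sums to 1)
print(dist[5])      # the OOV token now has nonzero probability (0.3 * 0.4)
```

Because the copy distribution is indexed by the extended vocabulary, any word appearing in the dialogue history can be generated even if it never occurs in the training vocabulary.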
