Disentangled Knowledge Transfer for OOD Intent Discovery with Unified Contrastive Learning

2022-05-01 · ACL 2022 · Code Available

Yutao Mou, Keqing He, Yanan Wu, Zhiyuan Zeng, Hong Xu, Huixing Jiang, Wei Wu, Weiran Xu


Abstract

Discovering out-of-domain (OOD) intents is essential for developing new skills in a task-oriented dialogue system. The key challenge is how to transfer prior in-domain (IND) knowledge to OOD clustering. Unlike existing work based on a shared intent representation, we propose a novel disentangled knowledge transfer method via a unified multi-head contrastive learning framework, aiming to bridge the gap between IND pre-training and OOD clustering. Experiments and analysis on two benchmark datasets show the effectiveness of our method.
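To make the contrastive-learning ingredient concrete, here is a minimal sketch of an InfoNCE-style contrastive loss for a single anchor embedding: the anchor is pulled toward a positive example and pushed away from negatives. This is an illustrative toy in plain Python, not the paper's multi-head framework; the function name `info_nce` and the temperature value are assumptions for the example.

```python
import math

def info_nce(anchor, positive, negatives, temperature=0.07):
    """Toy InfoNCE contrastive loss for one anchor.

    anchor, positive: embedding vectors (lists of floats)
    negatives: list of embedding vectors to push away from
    Returns -log softmax probability of the positive pair.
    """
    def dot(u, v):
        return sum(a * b for a, b in zip(u, v))

    def cosine(u, v):
        return dot(u, v) / (math.sqrt(dot(u, u)) * math.sqrt(dot(v, v)))

    # Similarity logits: positive pair first, then all negatives.
    logits = [cosine(anchor, positive) / temperature] + [
        cosine(anchor, n) / temperature for n in negatives
    ]
    # Numerically stable -log softmax of the positive logit.
    m = max(logits)
    log_denom = m + math.log(sum(math.exp(l - m) for l in logits))
    return -(logits[0] - log_denom)

# Anchor nearly aligned with the positive and opposed to the negative
# yields a small loss; swapping the roles makes the loss large.
loss = info_nce([1.0, 0.0], [0.9, 0.1], [[-1.0, 0.0]])
print(loss)
```

Frameworks like the one described in the paper typically apply such a loss over in-batch positives and negatives; the disentangling here would come from using separate projection heads for the IND and OOD objectives.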
