
Zero-shot Slot Filling in the Age of LLMs for Dialogue Systems

2024-11-28

Mansi Rana, Kadri Hacioglu, Sindhuja Gopalan, Maragathamani Boothalingam

Abstract

Zero-shot slot filling is a well-established subtask of Natural Language Understanding (NLU). However, most existing methods primarily focus on single-turn text data, overlooking the unique complexities of conversational dialogue. Conversational data is highly dynamic, often involving abrupt topic shifts, interruptions, and implicit references that make it difficult to directly apply zero-shot slot filling techniques, even with the remarkable capabilities of large language models (LLMs). This paper addresses these challenges by proposing strategies for automatic data annotation with slot induction and black-box knowledge distillation (KD) from a teacher LLM to a smaller model, achieving a 26% absolute F1-score improvement over vanilla LLMs on internal datasets. Additionally, we introduce an efficient system architecture for call-center product settings that surpasses off-the-shelf extractive models by a 34% relative gain in F1 score, enabling near-real-time inference on dialogue streams with higher accuracy while preserving low latency.
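The black-box KD setup described in the abstract can be illustrated with a minimal sketch: a teacher (queried only through its outputs, as with an LLM API) produces slot annotations for unlabeled dialogue turns, and a cheap student model is fit to those pseudo-labels so that inference no longer requires the teacher. All names below (`teacher_annotate`, `distill`, `student_fill`), the rule-based teacher stub, and the lexicon-based student are illustrative assumptions, not the paper's actual models.

```python
def teacher_annotate(utterance: str) -> dict:
    """Stand-in for a black-box teacher LLM returning {slot: value} pairs.
    A real system would prompt an LLM; this stub uses fixed rules."""
    rules = {"tomorrow": "date", "boston": "city", "visa": "card_type"}
    out = {}
    for tok in utterance.lower().split():
        tok = tok.strip("?.!,")
        if tok in rules:
            out[rules[tok]] = tok
    return out

def distill(unlabeled: list[str]) -> dict:
    """'Student training': learn a value->slot lexicon from teacher pseudo-labels.
    (The paper trains a smaller neural model; a lexicon keeps the sketch tiny.)"""
    lexicon = {}
    for utt in unlabeled:
        for slot, value in teacher_annotate(utt).items():
            lexicon[value] = slot
    return lexicon

def student_fill(utterance: str, lexicon: dict) -> dict:
    """Student inference: a fast lookup with no teacher (LLM) call, suitable
    for near-real-time processing of a dialogue stream."""
    out = {}
    for tok in utterance.lower().split():
        tok = tok.strip("?.!,")
        if tok in lexicon:
            out[lexicon[tok]] = tok
    return out

# Unlabeled call transcript turns are annotated once by the teacher...
dialogue_turns = ["I fly to Boston tomorrow", "Pay with my Visa please"]
lexicon = distill(dialogue_turns)
# ...then the student handles new turns on its own.
print(student_fill("Can I use Visa in Boston?", lexicon))
```

The point of the black-box setting is that only the teacher's input-output behavior is used: no logits, weights, or gradients, which is all an API-gated LLM exposes.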
