
InFoBERT: Zero-Shot Approach to Natural Language Understanding Using Contextualized Word Embedding

2021-09-01 · RANLP 2021

Pavel Burnyshev, Andrey Bout, Valentin Malykh, Irina Piontkovskaya


Abstract

Natural language understanding is an important task in modern dialogue systems, and it becomes more important with the rapid extension of dialogue systems' functionality. In this work, we present an approach to zero-shot transfer learning for the tasks of intent classification and slot filling based on pre-trained language models. We feed deep contextualized models with utterances and natural language descriptions of user intents to obtain text embeddings. These embeddings are then used by a small neural network to produce predictions for intent and slot probabilities. This architecture achieves new state-of-the-art results in two zero-shot scenarios: single-language new-skill adaptation and cross-lingual adaptation.
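The pipeline described in the abstract can be sketched as follows. This is a minimal, self-contained illustration, not the authors' implementation: the toy hashing encoder below stands in for the pre-trained contextualized model, the head weights are random rather than trained, and all function names (`encode_pair`, `score`, `intent_probabilities`) and dimensions are assumptions made for the sketch.

```python
import zlib
import numpy as np

rng = np.random.default_rng(0)

EMB_DIM = 32  # stand-in for the encoder's hidden size (a real BERT would use 768)

def encode_pair(utterance: str, intent_description: str) -> np.ndarray:
    """Toy stand-in for a contextualized encoder. In the paper, a pre-trained
    language model embeds the utterance together with a natural language
    description of the intent; here we just hash tokens into a fixed-size
    bag-of-words vector to keep the sketch dependency-free."""
    vec = np.zeros(EMB_DIM)
    for tok in (utterance + " " + intent_description).lower().split():
        vec[zlib.crc32(tok.encode()) % EMB_DIM] += 1.0
    return vec

# Small feed-forward head mapping the pair embedding to a scalar
# compatibility score (weights are random here; they would be trained).
W1 = rng.normal(scale=0.1, size=(EMB_DIM, 16))
w2 = rng.normal(scale=0.1, size=16)

def score(utterance: str, intent_description: str) -> float:
    h = np.tanh(encode_pair(utterance, intent_description) @ W1)
    return float(h @ w2)

def intent_probabilities(utterance: str, intent_descriptions: list[str]) -> np.ndarray:
    """Zero-shot intent classification: score the utterance against every
    candidate intent description and softmax over the scores. New intents
    need only a description, not labeled training data."""
    logits = np.array([score(utterance, d) for d in intent_descriptions])
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()

descriptions = [
    "the user wants to book a flight",
    "the user asks about the weather",
]
probs = intent_probabilities("is it going to rain tomorrow", descriptions)
```

Because intents are represented by descriptions rather than by fixed label indices, the same scoring head can be applied to skills (or languages, with a multilingual encoder) unseen during training, which is what enables the two zero-shot scenarios the abstract mentions.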
