SOTAVerified

The impact of domain-specific representations on BERT-based multi-domain spoken language understanding

2021-04-01 · EACL (AdaptNLP) 2021

Judith Gaspers, Quynh Do, Tobias Röding, Melanie Bradford

Abstract

This paper provides the first experimental study of the impact of domain-specific representations on a BERT-based multi-task spoken language understanding (SLU) model for multi-domain applications. Our results on a real-world dataset covering three languages indicate that adversarially learned domain-specific representations improve model performance across all three SLU subtasks: domain classification, intent classification, and slot filling. Gains are particularly large for domains with limited training data.
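The abstract does not spell out how the domain-specific representations are learned adversarially; a common mechanism for adversarial feature learning is a gradient reversal layer (Ganin & Lempitsky, 2015), combined with a weighted sum of the per-subtask losses. The sketch below is a minimal, framework-free illustration of those two ideas under that assumption; all names, weights, and the loss combination are illustrative, not taken from the paper.

```python
import numpy as np

class GradientReversal:
    """Gradient reversal layer: identity in the forward pass, negated
    (and scaled) gradient in the backward pass. Training a domain
    classifier through this layer pushes the shared encoder toward
    domain-invariant features; this is one common adversarial setup,
    assumed here for illustration."""

    def __init__(self, lam: float = 1.0):
        self.lam = lam  # strength of the reversed gradient

    def forward(self, x: np.ndarray) -> np.ndarray:
        # Pass features through unchanged.
        return x

    def backward(self, grad_output: np.ndarray) -> np.ndarray:
        # Flip and scale the incoming gradient.
        return -self.lam * grad_output


def joint_loss(l_domain: float, l_intent: float, l_slot: float,
               l_adv: float, adv_weight: float = 0.1) -> float:
    """Illustrative joint objective over the three SLU subtasks
    (domain classification, intent classification, slot filling)
    plus a weighted adversarial term; the weighting is an assumption."""
    return l_domain + l_intent + l_slot + adv_weight * l_adv
```

In a full model, `GradientReversal` would sit between the shared BERT encoder and an auxiliary domain discriminator, while the three task heads are trained on the combined loss.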
