Ask2Transformers: Zero-Shot Domain labelling with Pretrained Language Models
2021-01-01 · EACL (GWC) 2021 · Code Available
Oscar Sainz, German Rigau
- github.com/osainz59/Ask2Transformers (Official, in paper, PyTorch, ★ 152)
Abstract
In this paper we present a system that exploits different pre-trained Language Models to assign domain labels to WordNet synsets without any kind of supervision. Furthermore, the system is not restricted to a particular set of domain labels. We exploit the knowledge encoded within different off-the-shelf pre-trained Language Models and task formulations to infer the domain label of a particular WordNet definition. The proposed zero-shot system achieves a new state-of-the-art on the English dataset used in the evaluation.
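The abstract describes casting domain labelling as zero-shot classification over a WordNet gloss. A minimal sketch of one such task formulation, using the Hugging Face `transformers` zero-shot-classification pipeline (NLI-based): the model name, the domain label set, the hypothesis template, and the example gloss below are all illustrative assumptions, not the paper's exact configuration.

```python
def build_hypotheses(labels, template="The domain of the sentence is about {}."):
    """Turn candidate domain labels into NLI hypotheses.

    The template wording is an assumption for illustration; the paper
    explores several task formulations.
    """
    return [template.format(label) for label in labels]


def label_definition(definition, labels):
    """Rank candidate domain labels for a gloss by NLI entailment score.

    Requires the `transformers` package and downloads an MNLI model on
    first use, so the import is kept local to this function.
    """
    from transformers import pipeline  # heavy dependency, loaded lazily

    classifier = pipeline("zero-shot-classification",
                          model="roberta-large-mnli")
    result = classifier(definition, candidate_labels=labels,
                        hypothesis_template="The domain of the sentence is about {}.")
    # Pipeline returns labels sorted by descending entailment probability.
    return list(zip(result["labels"], result["scores"]))
```

Usage would look like `label_definition("an athletic competition in which a disk-shaped object is thrown", ["sport", "medicine", "music", "law"])`, with the top-scoring label taken as the synset's domain; no labelled training data is involved, which is what makes the approach zero-shot.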