
TLDR at SemEval-2022 Task 1: Using Transformers to Learn Dictionaries and Representations

2022-07-01 · SemEval (NAACL) 2022

Aditya Srivastava, Harsha Vardhan Vemulapati


Abstract

We propose a pair of deep learning models that employ unsupervised pretraining, attention mechanisms, and contrastive learning for representation learning from dictionary definitions, and for definition modeling from such representations. Our systems, the Transformers for Learning Dictionaries and Representations (TLDR), were submitted to SemEval 2022 Task 1: Comparing Dictionaries and Word Embeddings (CODWOE), where they officially ranked first on the definition modeling subtask and achieved competitive performance on the reverse dictionary subtask. In this paper we describe our methodology and analyse our system design hypotheses.
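The abstract mentions contrastive learning for aligning definition representations with word embeddings, but does not spell out the objective. As a hedged illustration only (the paper's actual loss, batching, and temperature are not given here), a common choice for such alignment is an InfoNCE-style loss, where each definition embedding is pulled toward its matching word embedding and pushed away from the other words in the batch:

```python
import numpy as np

def info_nce_loss(defn_embs, word_embs, temperature=0.07):
    """Sketch of an InfoNCE-style contrastive loss (illustrative, not the
    paper's exact objective). Row i of defn_embs is assumed to match
    row i of word_embs; all other rows serve as in-batch negatives."""
    # Normalize both sets of embeddings to unit length (cosine similarity)
    d = defn_embs / np.linalg.norm(defn_embs, axis=1, keepdims=True)
    w = word_embs / np.linalg.norm(word_embs, axis=1, keepdims=True)
    # Scaled pairwise similarities; matched pairs lie on the diagonal
    logits = d @ w.T / temperature
    # Log-softmax over each row, with max-subtraction for stability
    logits = logits - logits.max(axis=1, keepdims=True)
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Cross-entropy with the diagonal (matched pair) as the target class
    return -np.mean(np.diag(log_probs))

# Example with random embeddings (batch of 4, dimension 8)
rng = np.random.default_rng(0)
defs = rng.normal(size=(4, 8))
words = rng.normal(size=(4, 8))
loss = info_nce_loss(defs, words)
```

With perfectly aligned embeddings (`word_embs` equal to `defn_embs`) the loss approaches zero; with random pairings it stays near the log of the batch size, which is what makes it a useful training signal for representation alignment.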
