Embeddings in Natural Language Processing

2020-12-01 · COLING 2020

Jose Camacho-Collados, Mohammad Taher Pilehvar

Abstract

Embeddings have been one of the most important topics of interest in NLP for the past decade. Representing knowledge through a low-dimensional vector that is easily integrable into modern machine learning models has played a central role in the development of the field. Embedding techniques initially focused on words, but attention soon shifted to other forms. This tutorial will provide a high-level synthesis of the main embedding techniques in NLP, in the broad sense. We will start with conventional word embeddings (e.g., Word2Vec and GloVe) and then move to other types of embeddings, such as sense-specific and graph alternatives. We will conclude with an overview of the trending contextualized representations (e.g., ELMo and BERT) and explain their potential and impact in NLP.
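The core idea the abstract describes, representing words as low-dimensional vectors whose geometry reflects meaning, can be sketched with a toy example. The vectors and vocabulary below are hypothetical values chosen for illustration; real models such as Word2Vec or GloVe learn vectors of 100-300 dimensions from large corpora.

```python
import math

# Toy 3-dimensional word embeddings (hypothetical values for illustration;
# real embeddings are learned from corpus co-occurrence statistics).
embeddings = {
    "king":  [0.8, 0.6, 0.1],
    "queen": [0.7, 0.7, 0.1],
    "apple": [0.1, 0.2, 0.9],
}

def cosine(u, v):
    """Cosine similarity: the standard measure of closeness in embedding space."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Semantically related words should end up with nearby vectors.
sim_royal = cosine(embeddings["king"], embeddings["queen"])
sim_fruit = cosine(embeddings["king"], embeddings["apple"])
print(sim_royal > sim_fruit)  # the related pair is more similar
```

Contextualized models like ELMo and BERT differ from this static picture in that the same word receives a different vector in each sentence it appears in, but the similarity-in-vector-space intuition carries over.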