
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches that train on an NLP task such as language modeling or document classification.
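As a concrete illustration of the mapping described above, the sketch below trains Word2Vec embeddings with the gensim library (version 4.x API) and looks up the resulting real-valued vectors. The toy corpus and hyperparameter values are illustrative assumptions, not part of any particular paper's setup.

```python
# Minimal sketch: learning word embeddings with gensim's Word2Vec (>= 4.0).
from gensim.models import Word2Vec

# Toy corpus of pre-tokenized sentences (illustrative only).
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["glove", "and", "word2vec", "learn", "word", "vectors"],
    ["language", "modeling", "trains", "word", "embeddings"],
]

# vector_size: dimensionality of each embedding vector
# window: context window size; sg=1 selects the skip-gram objective
# min_count=1 keeps every word, since this corpus is tiny
model = Word2Vec(corpus, vector_size=50, window=3, min_count=1, sg=1, epochs=50)

vec = model.wv["word"]      # a 50-dimensional vector of real numbers
print(vec.shape)            # (50,)

# Nearest neighbors in the learned vector space
print(model.wv.most_similar("word", topn=3))
```

On a real corpus, semantically related words end up with nearby vectors, which is what `most_similar` exploits.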

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2051–2060 of 4002 papers

Title | Status | Hype
Classification and Clustering of Arguments with Contextualized Word Embeddings | Code | 0
Language Modelling Makes Sense: Propagating Representations through WordNet for Full-Coverage Word Sense Disambiguation | Code | 1
Variational Sequential Labelers for Semi-Supervised Learning | Code | 0
Exploiting Entity BIO Tag Embeddings and Multi-task Learning for Relation Extraction with Imbalanced Data | | 0
Learning Bilingual Word Embeddings Using Lexical Definitions | | 0
An Open-World Extension to Knowledge Graph Completion Models | Code | 0
Considerations for the Interpretation of Bias Measures of Word Embeddings | | 0
Measuring Bias in Contextualized Word Representations | Code | 1
KaWAT: A Word Analogy Task Dataset for Indonesian | Code | 0
A Structured Distributional Model of Sentence Meaning and Processing | | 0

Leaderboard

No leaderboard results yet.