Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
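
As a concrete illustration of this definition, here is a minimal sketch in Python of mapping words to vectors and comparing them. The vocabulary and three-dimensional vectors are invented for the example; real embeddings are learned from data and typically have tens to hundreds of dimensions.

```python
import numpy as np

# Toy embedding table: each vocabulary word maps to a dense real-valued vector.
# These vectors are made up for illustration; real embeddings are learned.
embeddings = {
    "king":  np.array([0.80, 0.45, 0.10]),
    "queen": np.array([0.78, 0.50, 0.12]),
    "apple": np.array([0.05, 0.90, 0.60]),
}

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity, the standard way to compare embedding vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should end up with more similar vectors.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # lower
```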

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
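
As a minimal sketch of one such technique, the following trains a skip-gram Word2Vec model and queries the learned vectors. It assumes the third-party gensim library (version 4.x) and uses an invented toy corpus; training GloVe embeddings would require a separate implementation.

```python
from gensim.models import Word2Vec  # assumes gensim 4.x is installed

# Tiny toy corpus: each sentence is a list of pre-tokenized words.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "farmer", "grows", "an", "apple"],
]

# Train a skip-gram Word2Vec model (sg=1); sg=0 would use CBOW instead.
model = Word2Vec(
    sentences,
    vector_size=50,  # dimensionality of the learned vectors
    window=2,        # context window size
    min_count=1,     # keep every word in this tiny corpus
    sg=1,
    epochs=100,      # small corpus, so train for more epochs
)

# Look up the learned vector and nearest neighbors for a word.
print(model.wv["king"].shape)         # (50,)
print(model.wv.most_similar("king"))  # words with the most similar vectors
```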

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2841–2850 of 4002 papers

Title | Status | Hype
Mapping Unseen Words to Task-Trained Embedding Spaces | | 0
MappSent: a Textual Mapping Approach for Question-to-Question Similarity | | 0
Massive vs. Curated Embeddings for Low-Resourced Languages: the Case of Yorùbá and Twi | | 0
MatNexus: A Comprehensive Text Mining and Analysis Suite for Materials Discovery | | 0
MayoNLP at SemEval 2017 Task 10: Word Embedding Distance Pattern for Keyphrase Classification in Scientific Publications | | 0
MDR Cluster-Debias: A Nonlinear Word Embedding Debiasing Pipeline | | 0
Meaning at the Planck scale? Contextualized word embeddings for doing history, philosophy, and sociology of science | | 0
Meaning_space at SemEval-2018 Task 10: Combining explicitly encoded knowledge with information extracted from word embeddings | | 0
MEANT 2.0: Accurate semantic MT evaluation for any output language | | 0
Measure and Evaluation of Semantic Divergence across Two Languages | | 0
Page 285 of 401

Leaderboards

No leaderboard results yet.