
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
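
As a minimal illustration of the idea, the sketch below maps a few words to fixed-length real-valued vectors and compares them with cosine similarity. The vector values are invented for this example, not learned; real models fit them from data.

```python
import numpy as np

# Toy 4-dimensional embeddings (values are made up for illustration;
# a trained model would learn these from a corpus).
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1, 0.9]),
    "queen": np.array([0.7, 0.4, 0.1, 0.8]),
    "apple": np.array([0.1, 0.9, 0.8, 0.0]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two embedding vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should end up with similar vectors.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```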

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
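
For instance, a small Word2Vec model can be trained with the gensim library. This is a sketch assuming gensim 4.x; the toy corpus is made up for illustration, and real training uses much larger text collections.

```python
from gensim.models import Word2Vec

# Tiny pre-tokenized corpus; real training uses millions of sentences.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context", "windows"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
]

# Train a small skip-gram model (sg=1); vector_size is the embedding dimension.
model = Word2Vec(sentences, vector_size=50, window=3, min_count=1, sg=1)

# Look up the learned vector for a word and query its nearest neighbours.
vector = model.wv["embeddings"]                    # a 50-dimensional numpy array
print(model.wv.most_similar("embeddings", topn=3))
```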

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing papers 1281-1290 of 4002 (page 129 of 401)

Title | Status | Hype
Contextualized Word Embeddings Encode Aspects of Human-Like Word Sense Knowledge | - | 0
Fair Embedding Engine: A Library for Analyzing and Mitigating Gender Bias in Word Embeddings | Code | 1
Word Embeddings for Chemical Patent Natural Language Processing | - | 0
Anchor-based Bilingual Word Embeddings for Low-Resource Languages | - | 0
Topic Modeling with Contextualized Word Representation Clusters | - | 0
Generating Adequate Distractors for Multiple-Choice Questions | - | 0
Applying Occam's Razor to Transformer-Based Dependency Parsing: What Works, What Doesn't, and What is Really Necessary | Code | 1
Dynamic Contextualized Word Embeddings | Code | 1
Comparative analysis of word embeddings in assessing semantic similarity of complex sentences | - | 0
FAME: Feature-Based Adversarial Meta-Embeddings for Robust Input Representations | Code | 1
