
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
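As a minimal sketch, this mapping can be viewed as a lookup table from vocabulary words into a dense real-valued matrix. The tiny vocabulary, dimensionality, and randomly initialized vectors below are hypothetical toy values for illustration, not drawn from any of the papers listed on this page.

    import numpy as np

    # Hypothetical toy vocabulary; real systems use tens of thousands of words.
    vocab = {"king": 0, "queen": 1, "man": 2, "woman": 3}
    embedding_dim = 4  # trained embeddings typically use 50-300 dimensions

    # The embedding matrix: one real-valued row vector per vocabulary word.
    rng = np.random.default_rng(0)
    embeddings = rng.normal(size=(len(vocab), embedding_dim))

    def embed(word: str) -> np.ndarray:
        """Map a word to its vector by table lookup."""
        return embeddings[vocab[word]]

    def cosine(u: np.ndarray, v: np.ndarray) -> float:
        """Cosine similarity, the usual notion of closeness between word vectors."""
        return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

    print(cosine(embed("king"), embed("queen")))

In a trained model the rows of the matrix are learned rather than random, so that semantically similar words end up with high cosine similarity.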

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches that train on an NLP task such as language modeling or document classification. Word2Vec learns vectors with a shallow neural network, while GloVe fits them to global word co-occurrence statistics.
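For instance, a Word2Vec model can be trained on a tokenized corpus with the gensim library. The two-sentence corpus below is a made-up placeholder, and the hyperparameters are illustrative rather than settings recommended by any particular paper.

    from gensim.models import Word2Vec

    # Made-up toy corpus: a list of tokenized sentences.
    corpus = [
        ["word", "embeddings", "map", "words", "to", "vectors"],
        ["word2vec", "trains", "on", "a", "language", "modeling", "task"],
    ]

    # Skip-gram Word2Vec (sg=1); vector_size is the embedding dimensionality.
    model = Word2Vec(corpus, vector_size=100, window=5, min_count=1, sg=1)

    # Look up the learned vector for a word and query its nearest neighbours.
    vec = model.wv["embeddings"]
    print(model.wv.most_similar("word", topn=3))

On a corpus this small the neighbours are meaningless; with millions of sentences the same calls recover the familiar similarity structure of word embeddings.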

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 171–180 of 4,002 papers

Title (Status, Hype):

OSCaR: Orthogonal Subspace Correction and Rectification of Biases in Word Embeddings (Code, 1)
Rethinking Positional Encoding in Language Pre-training (Code, 1)
Multilingual Jointly Trained Acoustic and Written Word Embeddings (Code, 1)
Embed2Detect: Temporally Clustered Embedded Words for Event Detection in Social Media (Code, 1)
Detecting Emergent Intersectional Biases: Contextualized Word Embeddings Contain a Distribution of Human-like Biases (Code, 1)
DeCLUTR: Deep Contrastive Learning for Unsupervised Textual Representations (Code, 1)
Improved acoustic word embeddings for zero-resource languages using multilingual transfer (Code, 1)
Nurse is Closer to Woman than Surgeon? Mitigating Gender-Biased Proximities in Word Embeddings (Code, 1)
Transition-based Semantic Dependency Parsing with Pointer Networks (Code, 1)
Adversarial Training for Commonsense Inference (Code, 1)
