
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
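To make the definition concrete, here is a minimal sketch in Python: each word maps to a real-valued vector, and geometric measures such as cosine similarity quantify relatedness. The three-dimensional vectors below are invented for illustration; learned embeddings are typically hundreds of dimensions.

```python
import numpy as np

# Toy embedding table: each word in the vocabulary maps to a vector of
# real numbers. These values are made up; real embeddings are learned.
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10]),
    "queen": np.array([0.78, 0.70, 0.15]),
    "apple": np.array([0.10, 0.20, 0.90]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two embedding vectors (1.0 = identical direction)."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high: related words
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low: unrelated words
```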

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
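As a hedged sketch of one such technique, the snippet below trains a skip-gram Word2Vec model with the gensim library on a toy corpus. The corpus and the hyperparameters (vector_size, window, epochs) are illustrative placeholders, not recommended settings.

```python
from gensim.models import Word2Vec

# Tiny tokenized corpus for illustration; real training uses millions of sentences.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "farmer", "grows", "apples"],
]

# Skip-gram Word2Vec (sg=1); all hyperparameters here are illustrative.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=100)

vec = model.wv["king"]                        # the learned embedding vector for "king"
print(model.wv.most_similar("king", topn=2))  # nearest neighbors in the vector space
```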

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2261–2270 of 4002 papers

Title | Status | Hype
SoftMatcha: A Soft and Fast Pattern Matcher for Billion-Scale Corpus Searches | - | 0
Softmax Bottleneck Makes Language Models Unable to Represent Multi-mode Word Distributions | - | 0
Solving Data Sparsity for Aspect Based Sentiment Analysis Using Cross-Linguality and Multi-Linguality | - | 0
SOS: Systematic Offensive Stereotyping Bias in Word Embeddings | - | 0
Sound Analogies with Phoneme Embeddings | - | 0
SoundChoice: Grapheme-to-Phoneme Models with Semantic Disambiguation | - | 0
Sounds Wilde. Phonetically Extended Embeddings for Author-Stylized Poetry Generation | - | 0
Sound-Word2Vec: Learning Word Representations Grounded in Sounds | - | 0
Page 227 of 401

No leaderboard results yet.