
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
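Conceptually, this mapping is just a lookup table: each vocabulary entry indexes a row in a matrix of real numbers. A minimal sketch in Python (the toy vocabulary, dimension, and random vectors below are illustrative assumptions; in practice the vectors are learned):

```python
import numpy as np

# Toy vocabulary and a randomly initialized embedding matrix.
# Real systems learn these values during training; random values stand in here.
vocab = {"king": 0, "queen": 1, "apple": 2}
embedding_dim = 4
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), embedding_dim))

def embed(word: str) -> np.ndarray:
    """Map a word to its real-valued vector via table lookup."""
    return embeddings[vocab[word]]

print(embed("king"))  # a 4-dimensional vector of real numbers
```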

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
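As one concrete possibility, here is a sketch of training skip-gram Word2Vec embeddings, assuming the gensim library (4.x API); the tiny in-memory corpus and the hyperparameters are placeholders chosen for illustration, not taken from any referenced paper:

```python
from gensim.models import Word2Vec

# Tiny illustrative corpus: one tokenized sentence per list.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
    ["glove", "factorizes", "a", "co-occurrence", "matrix"],
]

# Skip-gram (sg=1) Word2Vec; vector_size is the embedding dimension.
# min_count=1 keeps every token despite the tiny corpus.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

vec = model.wv["embeddings"]                  # 50-dimensional vector for a word
print(model.wv.most_similar("word", topn=3))  # nearest neighbors by cosine similarity
```

Word2Vec learns from local context windows like the one above, whereas GloVe fits vectors to global co-occurrence statistics of the corpus.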

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2471–2480 of 4002 papers

Title | Hype
Towards Entity Spaces | 0
Towards hate speech detection in low-resource languages: Comparing ASR to acoustic word embeddings on Wolof and Swahili | 0
Towards High Accuracy Named Entity Recognition for Icelandic | 0
Towards Lexical Chains for Knowledge-Graph-based Word Embeddings | 0
Towards Lower Bounds on Number of Dimensions for Word Embeddings | 0
Towards Optimal Transport with Global Invariances | 0
Towards Qualitative Word Embeddings Evaluation: Measuring Neighbors Variation | 0
Towards Resolving Word Ambiguity with Word Embeddings | 0
Towards Smart Point-and-Shoot Photography | 0
Towards the Understanding of Gaming Audiences by Modeling Twitch Emotes | 0
Page 248 of 401

Leaderboards

No leaderboard results yet.