
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
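
For illustration, a minimal Python sketch of what such a mapping looks like: a lookup table from vocabulary words to dense real-valued vectors, compared with cosine similarity. The vectors here are random placeholders rather than trained embeddings, and the vocabulary, dimensionality, and helper function are illustrative assumptions, not part of any particular method.

```python
import numpy as np

# Toy embedding table: each vocabulary word maps to a dense vector of
# real numbers. Random placeholders stand in for trained embeddings,
# which would place semantically related words close together.
rng = np.random.default_rng(seed=0)
vocab = ["king", "queen", "apple", "banana"]
embeddings = {word: rng.normal(size=50) for word in vocab}

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine of the angle between two vectors, the usual closeness measure."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(embeddings["king"].shape)  # (50,) -- a 50-dimensional real vector
print(cosine_similarity(embeddings["king"], embeddings["queen"]))
```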

Techniques for learning word embeddings include Word2Vec and GloVe, as well as other neural network-based approaches that train on an NLP task such as language modeling or document classification.
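
As a concrete example, the sketch below trains a small skip-gram Word2Vec model with the gensim library (assuming gensim >= 4.0, where the dimensionality parameter is named vector_size). The tiny corpus is made up for illustration; a real model would be trained on a large tokenized corpus.

```python
from gensim.models import Word2Vec

# A made-up, whitespace-tokenized corpus; real training uses millions of sentences.
sentences = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "log"],
    ["cats", "and", "dogs", "are", "animals"],
]

# sg=1 selects the skip-gram objective; sg=0 would use CBOW instead.
model = Word2Vec(sentences, vector_size=50, window=3, min_count=1, sg=1, epochs=50)

vec = model.wv["cat"]                        # the learned 50-dimensional vector for "cat"
print(vec.shape)                             # (50,)
print(model.wv.most_similar("cat", topn=3))  # nearest neighbors in embedding space
```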

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing papers 3041–3050 of 4002 (page 305 of 401)

MappSent: a Textual Mapping Approach for Question-to-Question Similarity
Detecting Metaphorical Phrases in the Polish Language
Gender Prediction for Chinese Social Media Data
Word Embeddings for Multi-label Document Classification
Word Embeddings as Features for Supervised Coreference Resolution
Extracting Tags from Large Raw Texts Using End-to-End Memory Networks
Fully Delexicalized Contexts for Syntax-Based Word Embeddings
What do we need to know about an unknown word when parsing German
Playing with Embeddings : Evaluating embeddings for Robot Language Learning through MUD Games
Character and Subword-Based Word Representation for Neural Language Modeling Prediction

Leaderboards

No leaderboard results yet.