Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
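As a toy illustration, the sketch below shows the core idea: each word is a point in a vector space, and geometric closeness (here, cosine similarity) tracks semantic similarity. The 4-dimensional vectors are made up for the example; real embeddings have hundreds of learned dimensions.

```python
import numpy as np

# Hypothetical 4-dimensional embeddings for three words; real models
# learn vectors with hundreds of dimensions from large corpora.
embeddings = {
    "king":  np.array([0.50, 0.68, -0.59, 0.10]),
    "queen": np.array([0.54, 0.70, -0.55, 0.12]),
    "apple": np.array([-0.40, 0.05, 0.80, -0.30]),
}

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine of the angle between two word vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words end up close together in the space.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high (~0.998)
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low (negative)
```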

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
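For instance, a skip-gram Word2Vec model can be trained in a few lines. This is a minimal sketch assuming the gensim library (4.x API) and a toy corpus, not a production setup:

```python
from gensim.models import Word2Vec

# Tiny toy corpus of tokenized sentences; a realistic model
# would be trained on millions of tokens.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
]

# Skip-gram (sg=1) with 50-dimensional vectors.
model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # embedding dimensionality
    window=2,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=100,       # many passes to compensate for the tiny corpus
)

vector = model.wv["embeddings"]                        # learned 50-d vector
neighbors = model.wv.most_similar("embeddings", topn=3)  # nearest words
print(vector.shape, neighbors)
```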

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing papers 3241–3250 of 4002 (page 325 of 401)

Titles (papers with released code are marked "[Code]"; every paper on this page has a Hype score of 0):

Character-Aware Neural Morphological Disambiguation
ESTEEM: A Novel Framework for Qualitatively Evaluating and Visualizing Spatiotemporal Embeddings in Social Media [Code]
Temporal Word Analogies: Identifying Lexical Replacement with Diachronic Word Embeddings [Code]
Varying Linguistic Purposes of Emoji in (Twitter) Context
Methodical Evaluation of Arabic Word Embeddings
Obtaining referential word meanings from visual and distributional information: Experiments on object naming
Neural Joint Model for Transition-based Chinese Syntactic Analysis
Learning bilingual word embeddings with (almost) no bilingual data
Information-Theory Interpretation of the Skip-Gram Negative-Sampling Objective Function
Semantic Word Clusters Using Signed Spectral Clustering
