
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
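
As a concrete sketch of how such embeddings are learned in practice, the snippet below trains a small skip-gram Word2Vec model with the gensim library (gensim 4.x). The library choice, the toy corpus, and the hyperparameter values are illustrative assumptions, not something this page prescribes.

    # Minimal sketch: train skip-gram Word2Vec embeddings (gensim 4.x API).
    from gensim.models import Word2Vec

    # Toy corpus: each document is a list of tokens.
    corpus = [
        ["word", "embeddings", "map", "words", "to", "dense", "vectors"],
        ["words", "in", "similar", "contexts", "get", "similar", "vectors"],
    ]

    model = Word2Vec(
        sentences=corpus,
        vector_size=50,   # dimensionality of the embedding vectors (assumed)
        window=2,         # context window size (assumed)
        min_count=1,      # keep every token in this toy corpus
        sg=1,             # 1 = skip-gram, 0 = CBOW
    )

    # Each vocabulary word is now mapped to a 50-dimensional real vector,
    # and nearby vectors correspond to words used in similar contexts.
    vector = model.wv["words"]
    print(vector.shape)
    print(model.wv.most_similar("words", topn=3))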

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3251–3260 of 4002 papers

Title | Status | Hype
Improved Word Representation Learning with Sememes | Code | 0
Sentence Alignment Methods for Improving Text Simplification Systems | - | 0
Skip-Gram − Zipf + Uniform = Vector Additivity | - | 0
Exploring Neural Text Simplification Models | Code | 0
Enriching Complex Networks with Word Embeddings for Detecting Mild Cognitive Impairment from Speech Transcripts | - | 0
Determining Gains Acquired from Word Embedding Quantitatively Using Discrete Distribution Clustering | - | 0
A Progressive Learning Approach to Chinese SRL Using Heterogeneous Data | - | 0
Apples to Apples: Learning Semantics of Common Entities Through a Novel Comprehension Task | - | 0
An Unsupervised Neural Attention Model for Aspect Extraction | Code | 0
A Multidimensional Lexicon for Interpersonal Stancetaking | - | 0
Page 326 of 401

Leaderboard

No leaderboard results yet.