
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

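Concretely, an embedding is just a lookup table from vocabulary words to dense real-valued vectors, and word similarity can be measured by the angle between vectors. The sketch below uses invented 4-dimensional vectors purely for illustration; real embeddings are learned from data and typically have tens to hundreds of dimensions.

```python
import numpy as np

# Toy embedding table: each vocabulary word maps to a dense real-valued vector.
# These 4-dimensional vectors are made up for illustration only; real embeddings
# are learned from corpora and usually have 50-300 dimensions.
embeddings = {
    "king":  np.array([0.80, 0.45, 0.10, 0.05]),
    "queen": np.array([0.78, 0.48, 0.12, 0.40]),
    "apple": np.array([0.05, 0.10, 0.90, 0.20]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; close to 1.0 for similar words."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```
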
Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.

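As a rough illustration of how such embeddings are trained in practice, the sketch below uses the third-party gensim library's Word2Vec implementation on a made-up toy corpus. The corpus and hyperparameters are placeholders, not anything this page specifies, and gensim >= 4.0 argument names are assumed.

```python
from gensim.models import Word2Vec

# A tiny pre-tokenized corpus; real training uses millions of sentences.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "raw", "text"],
    ["similar", "words", "get", "similar", "vectors"],
]

# sg=1 selects the skip-gram architecture; sg=0 would use CBOW.
# vector_size sets the embedding dimensionality; min_count=1 keeps
# every word, which only makes sense for a toy corpus this small.
model = Word2Vec(sentences, vector_size=100, window=5, min_count=1, sg=1)

vec = model.wv["embeddings"]           # the learned 100-dim vector for a word
print(model.wv.most_similar("words"))  # nearest neighbors in embedding space
```
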
(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing papers 3071–3080 of 4002 (page 308 of 401)

Improving neural tagging with lexical information
Discovering Stylistic Variations in Distributional Vector Space Models via Lexical Paraphrases
Automatic Community Creation for Abstractive Spoken Conversations Summarization
Evaluation of word embeddings against cognitive processes: primed reaction times in lexical decision and naming tasks
Prepositional Phrase Attachment over Word Embedding Products
IITP at EmoInt-2017: Measuring Intensity of Emotions using Sentence Embeddings and Optimized Features
MEANT 2.0: Accurate semantic MT evaluation for any output language
The strange geometry of skip-gram with negative sampling
A Question Answering Approach for Emotion Cause Extraction
Deriving continuous grounded meaning representations from referentially structured multimodal contexts
