
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
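
To make this concrete, the sketch below shows what such a mapping looks like in practice: an embedding is just a lookup table from words to dense vectors, and closeness between words is typically measured with cosine similarity. The vocabulary and the four-dimensional vectors are made-up illustrations, not trained embeddings.

```python
import numpy as np

# A word embedding is a lookup table mapping each vocabulary word to a
# dense vector of real numbers. These values are illustrative placeholders.
embeddings = {
    "king":  np.array([0.50, 0.68, -0.59, 0.12]),
    "queen": np.array([0.54, 0.71, -0.55, 0.20]),
    "apple": np.array([-0.32, 0.05, 0.91, -0.44]),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Standard measure of closeness between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Semantically related words should end up closer in the vector space.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```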

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
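
As one concrete example of such a technique, the sketch below trains a small Word2Vec model with the gensim library (gensim is not mentioned above; it is simply one common open-source implementation). The toy corpus and hyperparameters are illustrative assumptions only.

```python
from gensim.models import Word2Vec

# Toy corpus: each sentence is a pre-tokenized list of words. A real
# model would be trained on a large corpus such as a Wikipedia dump.
corpus = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "log"],
    ["cats", "and", "dogs", "are", "animals"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,  # dimensionality of the learned vectors
    window=2,        # context window size
    min_count=1,     # keep all words (only sensible for a toy corpus)
    sg=1,            # 1 = skip-gram, 0 = CBOW
    epochs=100,      # many passes, since the corpus is tiny
)

# Look up a learned embedding and its nearest neighbours.
print(model.wv["cat"].shape)                 # (50,)
print(model.wv.most_similar("cat", topn=3))
```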

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2921–2930 of 4002 papers

| Title | Status | Hype |
| --- | --- | --- |
| Beyond Word2Vec: Embedding Words and Phrases in Same Vector Space | Code | 0 |
| Conditional Generative Adversarial Networks for Emoji Synthesis with Word Embedding Manipulation | | 0 |
| Social Emotion Mining Techniques for Facebook Posts Reaction Prediction | Code | 0 |
| Recognizing Plans by Learning Embeddings from Observed Action Distributions | | 0 |
| AWE-CM Vectors: Augmenting Word Embeddings with a Clinical Metathesaurus | Code | 0 |
| Monolingual Embeddings for Low Resourced Neural Machine Translation | Code | 0 |
| Leveraging Linguistic Resources for Improving Neural Text Classification | | 0 |
| Context Selection for Embedding Models | Code | 0 |
| SentiNLP at IJCNLP-2017 Task 4: Customer Feedback Analysis Using a Bi-LSTM-CNN Model | | 0 |
| YNU-HPCC at IJCNLP-2017 Task 5: Multi-choice Question Answering in Exams Using an Attention-based LSTM Model | | 0 |

No leaderboard results yet.