
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
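As a minimal sketch of this idea, the toy Python snippet below hand-writes a tiny embedding table; the vocabulary, vector values, and dimensionality are all invented for illustration (real embeddings are learned from data, not hand-written):

```python
import numpy as np

# Hypothetical toy vocabulary and a 4-dimensional embedding matrix:
# row i is the real-valued vector for word i.
vocab = {"king": 0, "queen": 1, "apple": 2}
embeddings = np.array([
    [0.51, 0.92, 0.10, 0.33],   # "king"
    [0.48, 0.95, 0.08, 0.35],   # "queen"
    [0.02, 0.11, 0.88, 0.74],   # "apple"
])

def embed(word: str) -> np.ndarray:
    """Map a word to its dense vector via table lookup."""
    return embeddings[vocab[word]]

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# "king" sits closer to "queen" than to "apple" in this toy space.
print(cosine(embed("king"), embed("queen")))  # ~0.999
print(cosine(embed("king"), embed("apple")))  # ~0.35
```

The point of learned embeddings is that this geometry emerges from data: words used in similar contexts end up with nearby vectors.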

Techniques for learning word embeddings include neural approaches such as Word2Vec, count-based methods such as GloVe, and other models trained on an NLP task such as language modeling or document classification.
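For instance, a common way to learn such embeddings in practice is the Word2Vec implementation in the gensim library. The sketch below assumes gensim 4.x; the three-sentence corpus and the hyperparameter values are made up for illustration:

```python
from gensim.models import Word2Vec

# Toy corpus: each sentence is a list of tokens.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "word", "vectors", "from", "context"],
    ["similar", "words", "end", "up", "with", "similar", "vectors"],
]

# Train a skip-gram model (sg=1); vector_size is the embedding
# dimensionality. All hyperparameters here are illustrative.
model = Word2Vec(
    sentences,
    vector_size=50,
    window=2,
    min_count=1,
    sg=1,
    epochs=100,
)

# Look up the learned 50-dimensional vector for a token.
print(model.wv["vectors"].shape)  # (50,)

# Nearest neighbours by cosine similarity in embedding space.
print(model.wv.most_similar("words", topn=3))
```

On a realistic corpus the same code applies unchanged; only the input sentences and hyperparameters (larger `vector_size`, higher `min_count`) would differ.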

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1721–1730 of 4002 papers

Title | Status | Hype
A Deep Learning Approach to Behavior-Based Learner Modeling | - | 0
Zero-Shot Activity Recognition with Videos | - | 0
Generating Sense Embeddings for Syntactic and Semantic Analogy for Portuguese | Code | 0
Text-based inference of moral sentiment change | - | 0
A Common Semantic Space for Monolingual and Cross-Lingual Meta-Embeddings | Code | 0
DSR: A Collection for the Evaluation of Graded Disease-Symptom Relations | - | 0
Humpty Dumpty: Controlling Word Meanings via Corpus Poisoning | - | 0
Balancing the composition of word embeddings across heterogenous data sets | - | 0
Visual Storytelling via Predicting Anchor Word Embeddings in the Stories | - | 0
On the Replicability of Combining Word Embeddings and Retrieval Models | - | 0
Page 173 of 401

No leaderboard results yet.