
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
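
To make the mapping concrete, the sketch below shows words stored as real-valued vectors and compared with cosine similarity, the standard closeness measure in an embedding space. The vectors and vocabulary here are invented for illustration, not learned from data.

```python
import numpy as np

# Toy 4-dimensional embeddings; real models use tens to hundreds of
# dimensions and learn the values from a corpus. These are made up.
embeddings = {
    "king":  np.array([0.8, 0.1, 0.7, 0.2]),
    "queen": np.array([0.7, 0.2, 0.8, 0.3]),
    "apple": np.array([0.1, 0.9, 0.0, 0.6]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```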

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
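
As a rough illustration of how such embeddings are trained in practice, here is a minimal sketch using the gensim library (an assumption for this example; it is not tied to any paper listed below), which implements Word2Vec's skip-gram and CBOW objectives:

```python
from gensim.models import Word2Vec

# Tiny toy corpus: each document is a list of tokens.
# A real corpus would contain millions of sentences.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["i", "ate", "an", "apple"],
]

# sg=1 selects the skip-gram objective (sg=0 would select CBOW).
model = Word2Vec(
    corpus,
    vector_size=50,   # dimensionality of the learned vectors
    window=2,         # context window size
    min_count=1,      # keep every token, even rare ones
    sg=1,
    epochs=100,
)

vec = model.wv["king"]                 # a 50-dimensional real-valued vector
print(model.wv.most_similar("king"))   # nearest neighbors by cosine similarity
```

With a corpus this small the neighbors are noisy; the point is only the workflow: tokenized text in, dense real-valued vectors out.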

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2261–2270 of 4002 papers

Exploring Word Embeddings for Unsupervised Textual User-Generated Content Normalization
Exploring Word Sense Disambiguation Abilities of Neural Machine Translation Systems (Non-archival Extended Abstract)
Exponential Family Embeddings
Exponential Family Word Embeddings: An Iterative Approach for Learning Word Vectors
Expressing Objects just like Words: Recurrent Visual Embedding for Image-Text Matching
Expressivity-aware Music Performance Retrieval using Mid-level Perceptual Features and Emotion Word Embeddings
Extending and Improving Wordnet via Unsupervised Word Embeddings
Extending Multi-Sense Word Embedding to Phrases and Sentences for Unsupervised Semantic Applications
Extending Text Informativeness Measures to Passage Interestingness Evaluation (Language Model vs. Word Embedding)
Extending WordNet with Fine-Grained Collocational Information via Supervised Distributional Learning

No leaderboard results yet.