
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
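As a minimal sketch of this idea, the snippet below stores a tiny embedding table and compares words by cosine similarity. The vocabulary and vector values are illustrative made-up numbers, not trained embeddings:

```python
import numpy as np

# Toy embedding table: each word maps to a vector of real numbers.
# These 3-d vectors are invented for illustration; trained models
# typically use 50-300 dimensions.
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10]),
    "queen": np.array([0.78, 0.68, 0.90]),
    "apple": np.array([0.05, 0.92, 0.40]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; values near 1.0 mean similar."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should end up closer together in the space.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))
print(cosine_similarity(embeddings["king"], embeddings["apple"]))
```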

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
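For instance, here is a hedged sketch of training a skip-gram Word2Vec model with the gensim library (assuming gensim 4.x; the toy corpus and hyperparameter values are placeholder choices, not recommendations):

```python
from gensim.models import Word2Vec

# Toy tokenized corpus; a real application would use a large text collection.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "raw", "text"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
]

# sg=1 selects the skip-gram objective; sg=0 would use CBOW instead.
model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the learned vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,
    epochs=50,
)

vec = model.wv["embeddings"]  # the learned 50-d vector for a word
print(model.wv.most_similar("embeddings", topn=3))
```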

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1511–1520 of 4002 papers

Exploring Vector Spaces for Semantic Relations
Exploring Wasserstein Distance across Concept Embeddings for Ontology Matching
Exploring Word Embedding for Drug Name Recognition
Exploring word embeddings and phonological similarity for the unsupervised correction of language learner errors
Exploring Word Embeddings for Unsupervised Textual User-Generated Content Normalization
Exploring Word Sense Disambiguation Abilities of Neural Machine Translation Systems (Non-archival Extended Abstract)
Exponential Family Embeddings
Exponential Family Word Embeddings: An Iterative Approach for Learning Word Vectors
Expressing Objects just like Words: Recurrent Visual Embedding for Image-Text Matching
BERT Transformer model for Detecting Arabic GPT2 Auto-Generated Tweets

Leaderboards

No leaderboard results yet.