
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
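
To make the mapping concrete, here is a minimal sketch of the core idea: a lookup table from words to dense real-valued vectors, compared with cosine similarity. The vocabulary, dimension, and random initialization below are illustrative assumptions, not from any particular trained model.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab = ["king", "queen", "man", "woman"]  # hypothetical toy vocabulary
dim = 4                                    # embedding dimension (assumed)

# The "embedding": each word in the vocabulary maps to a vector of reals.
embeddings = {w: rng.normal(size=dim) for w in vocab}

def cosine(u, v):
    """Cosine similarity, the usual way embedding vectors are compared."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(embeddings["king"])                               # vector for "king"
print(cosine(embeddings["king"], embeddings["queen"]))  # similarity score
```

In a trained model these vectors are learned rather than random, so that words appearing in similar contexts end up close together in the vector space.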

Techniques for learning word embeddings include Word2Vec, which trains a shallow neural network on a context-prediction task; GloVe, which factorizes global word co-occurrence statistics; and other approaches that learn vectors as a by-product of an NLP task such as language modeling or document classification.
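
As a hedged sketch of the Word2Vec route, the snippet below trains skip-gram embeddings with gensim (assuming gensim >= 4.0, where the dimensionality argument is `vector_size`). The three-sentence corpus is illustrative only; real training uses large text collections.

```python
from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences (illustrative, not real data).
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
    ["glove", "factorizes", "global", "co-occurrence", "statistics"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,  # dimensionality of the learned vectors
    window=2,        # context window size
    min_count=1,     # keep every word in this tiny corpus
    sg=1,            # 1 = skip-gram objective, 0 = CBOW
)

vec = model.wv["embeddings"]           # learned vector for a word
print(model.wv.most_similar("word"))   # nearest neighbors in embedding space
```

GloVe follows a different route to a similar end: instead of predicting context words, it fits vectors to the log of global co-occurrence counts.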

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1471–1480 of 4002 papers

Title | Hype
Explainable Semantic Space by Grounding Language to Vision with Cross-Modal Contrastive Learning | 0
Explaining and Generalizing Skip-Gram through Exponential Family Principal Component Analysis | 0
Explaining the Trump Gap in Social Distancing Using COVID Discourse | 0
Explaining Word Embeddings via Disentangled Representation | 0
Clustering Comparable Corpora of Russian and Ukrainian Academic Texts: Word Embeddings and Semantic Fingerprints | 0
Exploiting Class Labels to Boost Performance on Embedding-based Text Classification | 0
Exploiting Common Characters in Chinese and Japanese to Learn Cross-Lingual Word Embeddings via Matrix Factorization | 0
Clustering is Efficient for Approximate Maximum Inner Product Search | 0
Exploiting Entity BIO Tag Embeddings and Multi-task Learning for Relation Extraction with Imbalanced Data | 0
Detecting Fake News with Capsule Neural Networks | 0

No leaderboard results yet.