
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
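The mapping described above can be illustrated with a minimal sketch: each word is assigned a vector of real numbers, and geometric similarity (here, cosine similarity) between vectors stands in for semantic similarity. The words, vectors, and dimension below are made-up illustrative values, not taken from any trained model.

```python
import numpy as np

# Illustrative 3-dimensional embeddings (made-up values, not a trained model).
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10]),
    "queen": np.array([0.78, 0.70, 0.12]),
    "apple": np.array([0.10, 0.05, 0.90]),
}

def cosine(u, v):
    """Cosine similarity: near 1.0 for similar directions, near 0 for unrelated."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(embeddings["king"], embeddings["queen"]))  # close to 1
print(cosine(embeddings["king"], embeddings["apple"]))  # much smaller
```

In a real model the vectors are learned from text and typically have hundreds of dimensions, but the interface is the same: look up a word's vector, then compare vectors numerically.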

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
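As a concrete instance of such training, the sketch below implements a bare-bones skip-gram objective (the Word2Vec variant that predicts context words from a target word) with full softmax and gradient descent in NumPy. The corpus, window size, dimension, and learning rate are arbitrary toy choices; production implementations use tricks such as negative sampling that are omitted here.

```python
import numpy as np

# Toy corpus with a context window of 1; all hyperparameters are illustrative.
corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V, D = len(vocab), 8  # vocabulary size, embedding dimension

rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(V, D))   # input (target) embeddings
W_out = rng.normal(scale=0.1, size=(V, D))  # output (context) embeddings

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

lr = 0.1
for epoch in range(200):
    for t, word in enumerate(corpus):
        for c in (t - 1, t + 1):  # positions inside the window
            if 0 <= c < len(corpus):
                wi, ci = idx[word], idx[corpus[c]]
                p = softmax(W_out @ W_in[wi])      # P(context | target)
                grad = p.copy()
                grad[ci] -= 1.0                    # gradient of cross-entropy wrt logits
                g_in = W_out.T @ grad              # compute before updating W_out
                W_out -= lr * np.outer(grad, W_in[wi])
                W_in[wi] -= lr * g_in

# After training, W_in[idx[w]] is the learned embedding for word w.
```

The rows of `W_in` are the learned word vectors; words that appear in similar contexts (e.g. "cat" and "dog" here) tend to end up with similar vectors.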

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1811–1820 of 4002 papers

Title | Hype
----- | ----
Improved Neural Network-based Multi-label Classification with Better Initialization Leveraging Label Co-occurrence | 0
Exploiting Common Characters in Chinese and Japanese to Learn Cross-Lingual Word Embeddings via Matrix Factorization | 0
Clustering is Efficient for Approximate Maximum Inner Product Search | 0
Exploiting Class Labels to Boost Performance on Embedding-based Text Classification | 0
Clustering Comparable Corpora of Russian and Ukrainian Academic Texts: Word Embeddings and Semantic Fingerprints | 0
Improved Word Embeddings with Implicit Structure Information | 0
Explaining Word Embeddings via Disentangled Representation | 0
Explaining the Trump Gap in Social Distancing Using COVID Discourse | 0
Explaining and Generalizing Skip-Gram through Exponential Family Principal Component Analysis | 0
Explainable Semantic Space by Grounding Language to Vision with Cross-Modal Contrastive Learning | 0
Page 182 of 401

No leaderboard results yet.