
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
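As a concrete illustration, the sketch below maps a toy vocabulary to hand-picked vectors and compares words with cosine similarity. The words and vector values are hypothetical placeholders chosen for the example, not learned embeddings from any of the papers listed on this page.

```python
import numpy as np

# Toy illustration: each vocabulary word maps to a dense real-valued vector.
# These values are made up for the example; real embeddings are learned.
embeddings = {
    "king":  np.array([0.50, 0.68, -0.30]),
    "queen": np.array([0.52, 0.71, -0.25]),
    "apple": np.array([-0.40, 0.10, 0.90]),
}

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine of the angle between two vectors; higher means more similar."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should end up close together in the space.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```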

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches that train on an NLP task such as language modeling or document classification.
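For example, a minimal sketch of learning embeddings with the gensim library's Word2Vec implementation might look as follows; the tiny corpus and the hyperparameter values are illustrative assumptions, not a reference setup.

```python
from gensim.models import Word2Vec

# Placeholder corpus: one tokenized sentence per list entry.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context", "windows"],
    ["similar", "words", "get", "similar", "vectors"],
]

# Train a skip-gram Word2Vec model; hyperparameters are illustrative.
model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # embedding dimensionality
    window=3,         # context window size
    min_count=1,      # keep every word in this toy corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

vector = model.wv["embeddings"]                       # learned 50-dim vector
neighbors = model.wv.most_similar("embeddings", topn=3)
print(vector.shape, neighbors)
```

In practice the corpus would be much larger, and `min_count` would be raised to discard rare words before training.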

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing papers 1361–1370 of 4002

- Contrastive Visual Semantic Pretraining Magnifies the Semantics of Natural Language Representations
- A Trie-Structured Bayesian Model for Unsupervised Morphological Segmentation
- Continuous Word Embedding Fusion via Spectral Decomposition
- A Transparent Framework for Evaluating Unintended Demographic Bias in Word Embeddings
- A Distribution-based Model to Learn Bilingual Word Embeddings
- Context Vectors are Reflections of Word Vectors in Half the Dimensions
- A Topical Approach to Capturing Customer Insight In Social Media
- Analysis of Word Embeddings Using Fuzzy Clustering
- CG-CNN: Self-Supervised Feature Extraction Through Contextual Guidance and Transfer Learning
- Contextualizing Citations for Scientific Summarization using Word Embeddings and Domain Knowledge
