
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
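To make the definition concrete, here is a minimal sketch of such a mapping. The words and vectors below are invented for illustration (real embeddings are learned from data); the point is only that each word is a vector of real numbers and that similarity between words reduces to a geometric comparison.

```python
import numpy as np

# Toy embedding table: each word maps to a small real-valued vector.
# These vectors are made up for illustration; real embeddings are learned.
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1]),
    "queen": np.array([0.7, 0.4, 0.2]),
    "apple": np.array([0.1, 0.9, 0.5]),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two vectors; higher means more similar."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Semantically related words should end up with more similar vectors.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # lower
```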

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
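As one concrete instance of these techniques, the sketch below trains a skip-gram Word2Vec model with the gensim library (assuming gensim 4.x and its `Word2Vec` API). The tiny corpus and hyperparameter values are illustrative placeholders, not a recommended configuration.

```python
from gensim.models import Word2Vec

# A tiny tokenized corpus for illustration; real training uses large corpora.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "apple", "fell", "from", "the", "tree"],
]

# Train a skip-gram model (sg=1); min_count=1 keeps every word in this toy corpus.
model = Word2Vec(sentences, vector_size=50, window=2,
                 min_count=1, sg=1, epochs=50)

# Look up the learned vector for a word and query its nearest neighbors.
vec = model.wv["king"]
print(model.wv.most_similar("king", topn=2))
```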

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 191–200 of 4,002 papers

Title | Status | Hype
SimAlign: High Quality Word Alignments without Parallel Training Data using Static and Contextualized Embeddings | Code | 1
Null It Out: Guarding Protected Attributes by Iterative Nullspace Projection | Code | 1
Compass-aligned Distributional Embeddings for Studying Semantic Differences across Corpora | Code | 1
DeepSentiPers: Novel Deep Learning Models Trained Over Proposed Augmented Persian Sentiment Corpus | Code | 1
Classification Benchmarks for Under-resourced Bengali Language based on Multichannel Convolutional-LSTM Network | Code | 1
Information-Theoretic Probing for Linguistic Structure | Code | 1
Understanding Linearity of Cross-Lingual Word Embedding Mappings | Code | 1
Cycle Text-To-Image GAN with BERT | Code | 1
Machine learning as a model for cultural learning: Teaching an algorithm what it means to be fat | Code | 1
Deep Representation Learning of Electronic Health Records to Unlock Patient Stratification at Scale | Code | 1
Page 20 of 401

No leaderboard results yet.