
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
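As a concrete illustration of the mapping from words to real-valued vectors, here is a minimal sketch that trains skip-gram Word2Vec embeddings with the gensim library (gensim 4.x API assumed; the toy corpus and hyperparameter values are purely illustrative):

```python
# Minimal sketch: train skip-gram Word2Vec embeddings on a toy corpus
# (gensim 4.x API; corpus and hyperparameters are illustrative only).
from gensim.models import Word2Vec

# Tokenized toy corpus; a real setup would use a large text collection.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["king", "queen", "man", "woman"],
    ["vectors", "capture", "semantic", "similarity"],
]

model = Word2Vec(
    sentences=sentences,
    vector_size=100,  # dimensionality of the embedding vectors
    window=5,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
)

vec = model.wv["king"]                # the 100-dimensional vector for "king"
print(model.wv.most_similar("king"))  # nearest neighbors by cosine similarity
```

Once trained, each vocabulary word is a point in the embedding space, and proximity between points (typically cosine similarity) serves as a proxy for semantic relatedness.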

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2701–2710 of 4002 papers

WEAC: Word embeddings for anomaly classification from event logs
Weakly-Supervised Concept-based Adversarial Learning for Cross-lingual Word Embeddings
Weakly Supervised Cross-Lingual Named Entity Recognition via Effective Annotation and Representation Projection
Weakly Supervised Few-shot Object Segmentation using Co-Attention with Visual and Semantic Embeddings
WECHSEL: Effective initialization of subword embeddings for cross-lingual transfer of monolingual language models
Well-calibrated Confidence Measures for Multi-label Text Classification with a Large Number of Labels
WEmbSim: A Simple yet Effective Metric for Image Captioning
What Analogies Reveal about Word Vectors and their Compositionality
What are the biases in my word embedding?
What can you do with a rock? Affordance extraction via word embeddings
