
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
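As a toy illustration of the mapping described above, the sketch below assigns each word a small real-valued vector and compares words by cosine similarity. The 3-dimensional vectors are hand-picked for illustration, not learned from data; real systems learn vectors with hundreds of dimensions.

```python
import math

# Hypothetical, hand-picked 3-dimensional embeddings (illustrative only;
# learned embeddings come from training on large corpora).
embeddings = {
    "king":  [0.8, 0.6, 0.1],
    "queen": [0.7, 0.7, 0.1],
    "apple": [0.1, 0.2, 0.9],
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Semantically related words should end up closer in the vector space.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # lower
```

In a learned embedding space, this kind of similarity query is the basic operation behind nearest-neighbor word lookups and analogy tests.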

Techniques for learning word embeddings include Word2Vec, GloVe, and neural network-based approaches that train on an NLP task such as language modeling or document classification.
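For intuition on how Word2Vec-style training data is prepared, a minimal sketch of skip-gram pair extraction is shown below. The window size and whitespace tokenization are illustrative assumptions, and the actual training objective (predicting context words from center words) is omitted.

```python
def skipgram_pairs(tokens, window=2):
    """Yield (center, context) pairs as used in skip-gram training.

    Each word is paired with every neighbor within `window` positions;
    a model like Word2Vec then learns vectors such that a center word's
    vector predicts its context words.
    """
    pairs = []
    for i, center in enumerate(tokens):
        start = max(0, i - window)
        end = min(len(tokens), i + window + 1)
        for j in range(start, end):
            if j != i:  # skip pairing a word with itself
                pairs.append((center, tokens[j]))
    return pairs

sentence = "word embeddings map words to vectors".split()
print(skipgram_pairs(sentence, window=1))
```

GloVe differs in that it fits vectors to global co-occurrence counts rather than streaming over (center, context) pairs, but the notion of a context window is shared.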

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2701–2725 of 4002 papers

Title (Hype)
WEAC: Word embeddings for anomaly classification from event logs (0)
Weakly-Supervised Concept-based Adversarial Learning for Cross-lingual Word Embeddings (0)
Weakly Supervised Cross-Lingual Named Entity Recognition via Effective Annotation and Representation Projection (0)
Weakly Supervised Few-shot Object Segmentation using Co-Attention with Visual and Semantic Embeddings (0)
WECHSEL: Effective initialization of subword embeddings for cross-lingual transfer of monolingual language models (0)
Well-calibrated Confidence Measures for Multi-label Text Classification with a Large Number of Labels (0)
WEmbSim: A Simple yet Effective Metric for Image Captioning (0)
What Analogies Reveal about Word Vectors and their Compositionality (0)
What are the biases in my word embedding? (0)
What can you do with a rock? Affordance extraction via word embeddings (0)
What company do words keep? Revisiting the distributional semantics of J.R. Firth & Zellig Harris (0)
What does Neural Bring? Analysing Improvements in Morphosyntactic Annotation and Lemmatisation of Slovenian, Croatian and Serbian (0)
What Does This Word Mean? Explaining Contextualized Embeddings with Natural Language Definition (0)
What do we need to know about an unknown word when parsing German (0)
What do you mean, BERT? Assessing BERT as a Distributional Semantics Model (0)
What makes multilingual BERT multilingual? (0)
What's in a Name? Reducing Bias in Bios without Access to Protected Attributes (0)
What's in an Embedding? Analyzing Word Embeddings through Multilingual Evaluation (0)
What's in Your Embedding, And How It Predicts Task Performance (0)
What the Vec? Towards Probabilistically Grounded Embeddings (0)
When Hyperparameters Help: Beneficial Parameter Combinations in Distributional Semantic Models (0)
When Polysemy Matters: Modeling Semantic Categorization with Word Embeddings (0)
When Specialization Helps: Using Pooled Contextualized Embeddings to Detect Chemical and Biomedical Entities in Spanish (0)
When Word Embeddings Become Endangered (0)
Where exactly does contextualization in a PLM happen? (0)
Page 109 of 161

No leaderboard results yet.