
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
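As a concrete illustration, the sketch below trains skip-gram Word2Vec embeddings on a tiny toy corpus. It is a minimal example assuming the gensim library (4.x API); the corpus, embedding dimension, and hyperparameters are invented purely for illustration and are not drawn from any paper listed on this page.

```python
# Minimal Word2Vec sketch, assuming gensim >= 4.0 is installed.
from gensim.models import Word2Vec

# Each document is a list of tokens; real use would apply a proper tokenizer.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["embeddings", "map", "words", "to", "vectors"],
    ["similar", "words", "get", "similar", "vectors"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the real-valued vectors
    window=2,         # context window size
    min_count=1,      # keep every token in this toy vocabulary
    sg=1,             # 1 = skip-gram objective, 0 = CBOW
    epochs=50,        # more passes than usual, since the corpus is tiny
)

vec = model.wv["king"]                 # a 50-dimensional numpy array
print(vec.shape)                       # (50,)
print(model.wv.most_similar("king"))   # nearest neighbours by cosine similarity
```

Pretrained vectors such as GloVe can be loaded through gensim.downloader (e.g. api.load("glove-wiki-gigaword-100")) and queried with the same key-lookup and most_similar interface.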

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2041–2050 of 4002 papers

Title | Hype
Domain Adaptation for Named Entity Recognition in Online Media with Word Embeddings | 0
Domain adaptation for part-of-speech tagging of noisy user-generated text | 0
Domain Disentangled Generative Adversarial Network for Zero-Shot Sketch-Based 3D Shape Retrieval | 0
Comparative analysis of word embeddings in assessing semantic similarity of complex sentences | 0
Do Not Harm Protected Groups in Debiasing Language Representation Models | 0
Do not neglect related languages: The case of low-resource Occitan cross-lingual word embeddings | 0
Don't Forget Cheap Training Signals Before Building Unsupervised Bilingual Word Embeddings | 0
Do Nuclear Submarines Have Nuclear Captains? A Challenge Dataset for Commonsense Reasoning over Adjectives and Objects | 0

Leaderboards

No leaderboard results yet.