
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
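To make the mapping concrete, here is a minimal sketch in Python with NumPy. The vocabulary, dimensionality, and random initialization are illustrative assumptions (a trained model would learn these vectors); the point is only that an embedding is a lookup table from words to dense real-valued vectors, so comparing words reduces to vector arithmetic.

```python
import numpy as np

# Illustrative embedding table: each word maps to a dense real-valued vector.
# Vocabulary, dimensionality, and random values are toy assumptions.
vocab = ["king", "queen", "apple"]
dim = 4
rng = np.random.default_rng(0)
embeddings = {word: rng.normal(size=dim) for word in vocab}

def embed(word):
    """Look up a word's vector; unknown words fall back to a zero vector."""
    return embeddings.get(word, np.zeros(dim))

def cosine(u, v):
    """Cosine similarity: the standard way to compare embedding vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(embed("king"), embed("queen")))
```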

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
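As a hedged illustration of the Word2Vec family, the sketch below trains skip-gram embeddings with the gensim library (assumed installed via `pip install gensim`); the toy corpus and hyperparameter values are illustrative only, not from any cited paper.

```python
from gensim.models import Word2Vec

# A tiny tokenized corpus, purely for illustration.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "cat", "sat", "on", "the", "mat"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the learned vectors
    window=2,         # context words considered on each side of the target
    min_count=1,      # keep every word, since this corpus is tiny
    sg=1,             # 1 = skip-gram; 0 = CBOW
    epochs=50,        # more passes than usual, to compensate for tiny data
    seed=0,
)

vector = model.wv["king"]             # the learned embedding for "king"
print(model.wv.most_similar("king"))  # nearest neighbors in embedding space
```

On real corpora one would raise `min_count` and `vector_size` and use many more sentences; with a corpus this small the learned neighbors are not meaningful, only the mechanics are.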

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1411–1420 of 4002 papers

Title | Hype
Chinese Zero Pronoun Resolution with Deep Neural Networks | 0
Evaluating Sub-word Embeddings in Cross-lingual Models | 0
Evaluating the Consistency of Word Embeddings from Small Data | 0
Evaluating the Impact of Sub-word Information and Cross-lingual Word Embeddings on Mi'kmaq Language Modelling | 0
Detecting Policy Preferences and Dynamics in the UN General Debate with Neural Word Embeddings | 0
Evaluating the timing and magnitude of semantic change in diachronic word embedding models | 0
Detecting Paraphrases of Standard Clause Titles in Insurance Contracts | 0
Evaluating the Underlying Gender Bias in Contextualized Word Embeddings | 0
Evaluating vector-space models of analogy | 0
Better Early than Late: Fusing Topics with Word Embeddings for Neural Question Paraphrase Identification | 0
Page 142 of 401

Leaderboards

No leaderboard results yet.