Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
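
To make the mapping concrete, here is a minimal sketch of an embedding table and of how closeness between word vectors is usually measured (cosine similarity). The 4-dimensional vectors below are invented for illustration; real embeddings typically have 50 to 300 dimensions and are learned, not hand-written.

    import numpy as np

    # Hypothetical embedding table: each vocabulary word maps to a real-valued vector.
    embeddings = {
        "king":  np.array([0.8, 0.3, 0.1, 0.5]),
        "queen": np.array([0.7, 0.4, 0.1, 0.6]),
        "apple": np.array([0.1, 0.9, 0.8, 0.0]),
    }

    def cosine_similarity(u, v):
        # Cosine of the angle between two vectors: 1.0 means identical direction.
        return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

    # Semantically related words should end up with similar vectors.
    print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
    print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low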

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches that train on an NLP task such as language modeling or document classification.
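
As a small, hedged sketch of one such technique, the following trains a Word2Vec skip-gram model with the gensim library on a toy corpus. The corpus and hyperparameters (vector_size, window, epochs) are illustrative choices, not recommended settings.

    from gensim.models import Word2Vec

    # Toy corpus: each sentence is a list of tokens.
    sentences = [
        ["the", "king", "rules", "the", "kingdom"],
        ["the", "queen", "rules", "the", "kingdom"],
        ["dogs", "and", "cats", "are", "animals"],
    ]

    # Train a small skip-gram model (sg=1); sg=0 would use CBOW instead.
    model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

    vec = model.wv["king"]                        # learned 50-dimensional vector
    print(model.wv.most_similar("king", topn=2))  # nearest neighbors by cosine similarity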

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 671–680 of 4,002 papers (page 68 of 401)

Title | Status | Hype
Decision-Directed Data Decomposition | Code | 0
Evaluating Word Embeddings with Categorical Modularity | Code | 0
Evaluation of sentence embeddings in downstream and linguistic probing tasks | Code | 0
Evaluation Of Word Embeddings From Large-Scale French Web Content | Code | 0
Categorical Metadata Representation for Customized Text Classification | Code | 0
Causally Denoise Word Embeddings Using Half-Sibling Regression | Code | 0
Caveats of Measuring Semantic Change of Cognates and Borrowings using Multilingual Word Embeddings | Code | 0
CBOW Is Not All You Need: Combining CBOW with the Compositional Matrix Space Model | Code | 0
Aligning Word Vectors on Low-Resource Languages with Wiktionary | Code | 0
DebIE: A Platform for Implicit and Explicit Debiasing of Word Embedding Spaces | Code | 0

No leaderboard results yet.