Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
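
Concretely, an embedding is just a lookup table: each vocabulary word indexes a row of a real-valued matrix. Below is a minimal sketch in Python; the toy vocabulary, dimensionality, and random initialization are illustrative assumptions, not taken from any particular model.

import numpy as np

# Illustrative toy vocabulary; real systems use tens of thousands of words.
vocab = {"king": 0, "queen": 1, "man": 2, "woman": 3}
embedding_dim = 8  # trained embeddings typically use 50-300 dimensions

# One real-valued vector per word, randomly initialized here;
# training would adjust these rows so related words get similar vectors.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), embedding_dim))

def embed(word):
    # Map a word to its vector by a plain table lookup.
    return embeddings[vocab[word]]

print(embed("king"))  # an 8-dimensional vector of real numbers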

Techniques for learning word embeddings include Word2Vec, GloVe, and neural network models that learn embeddings while training on an NLP task such as language modeling or document classification.
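
As a concrete illustration, a Word2Vec model can be trained in a few lines with the gensim library. This is only a sketch: the toy corpus and hyperparameters are placeholders, and the parameter names assume gensim 4.x (where the size parameter was renamed vector_size).

from gensim.models import Word2Vec

# Placeholder corpus: a list of tokenized sentences.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["a", "man", "walks", "in", "the", "city"],
    ["a", "woman", "walks", "in", "the", "city"],
]

# sg=1 selects the skip-gram objective; sg=0 would use CBOW instead.
model = Word2Vec(
    sentences,
    vector_size=50,  # dimensionality of the learned vectors
    window=3,        # context window on each side of the target word
    min_count=1,     # keep every word in this tiny corpus
    sg=1,
)

vector = model.wv["king"]                          # the learned 50-dimensional vector
neighbors = model.wv.most_similar("king", topn=3)  # nearest words by cosine similarity

GloVe, by contrast, is fit to global word co-occurrence counts rather than a sliding-window prediction task, and its pretrained vectors are usually downloaded and loaded rather than retrained.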

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2291–2300 of 4002 papers

Title | Status | Hype
A Simple Regularization-based Algorithm for Learning Cross-Domain Word Embeddings | - | 0
Decomposing Generalization: Models of Generic, Habitual, and Episodic Statements | - | 0
No Training Required: Exploring Random Encoders for Sentence Classification | Code | 0
Evaluating Word Embedding Models: Methods and Experimental Results | - | 0
Analogies Explained: Towards Understanding Word Embeddings | - | 0
Word Embeddings: A Survey | - | 0
MORTY Embedding: Improved Embeddings without Supervision | - | 0
Context-Sensitive Malicious Spelling Error Correction | - | 0
Equalizing Gender Biases in Neural Machine Translation with Word Embeddings Techniques | Code | 0
Deconstructing Word Embeddings | - | 0

Leaderboard

No leaderboard results yet.