
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches that train on an auxiliary NLP task such as language modeling or document classification.

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)
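As a concrete illustration, the sketch below trains toy skip-gram embeddings with Word2Vec. It assumes the gensim library is available; the tiny corpus and the parameter values are purely illustrative, not taken from any paper on this page.

from gensim.models import Word2Vec

# Each document is a list of tokens; a real corpus would be far larger.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context", "windows"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "statistics"],
]

# Train a skip-gram model (sg=1); vector_size is the embedding dimension.
model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the learned vectors
    window=3,         # context window size
    min_count=1,      # keep every token in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

# Look up the dense vector for a word and query its nearest neighbours.
vec = model.wv["embeddings"]                  # a 50-dimensional numpy array
print(model.wv.most_similar("word", topn=3))  # words closest in vector space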

Papers

Showing 3571-3580 of 4002 papers

Title | Status | Hype
Evaluating Biased Attitude Associations of Language Models in an Intersectional Context | Code | 0
Evaluating Bias In Dutch Word Embeddings | Code | 0
Evaluating bilingual word embeddings on the long tail | Code | 0
Vector Embedding of Wikipedia Concepts and Entities | Code | 0
Semi-Supervised Learning for Bilingual Lexicon Induction | Code | 0
Ontology-Aware Token Embeddings for Prepositional Phrase Attachment | Code | 0
Word-Level Loss Extensions for Neural Temporal Relation Classification | Code | 0
A Survey on Contextualised Semantic Shift Detection | Code | 0
Text classification with word embedding regularization and soft similarity measure | Code | 0
Transparent, Efficient, and Robust Word Embedding Access with WOMBAT | Code | 0
Page 358 of 401

No leaderboard results yet.