
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and neural network-based approaches that train the embeddings on an NLP task such as language modeling or document classification.
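As a concrete illustration, the sketch below trains skip-gram Word2Vec embeddings with the gensim library (assuming gensim >= 4.0). The toy corpus and hyperparameter values are illustrative assumptions, not settings taken from any of the papers listed here.

```python
# Minimal sketch: learning word embeddings with skip-gram Word2Vec via gensim.
# The corpus below is a hypothetical stand-in for a real tokenized dataset.
from gensim.models import Word2Vec

# Each sentence is a list of tokens; real corpora would be far larger.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["a", "man", "walks", "in", "the", "city"],
    ["a", "woman", "walks", "in", "the", "city"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the embedding vectors
    window=2,         # context window size
    min_count=1,      # keep every token in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=100,
)

vec = model.wv["king"]                # a 50-dimensional real-valued vector
print(vec.shape)                      # (50,)
print(model.wv.most_similar("king"))  # nearest neighbors by cosine similarity
```

After training, each vocabulary word maps to a dense real-valued vector, and semantically related words (here, for example, "king" and "queen") tend to end up close together under cosine similarity.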

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3781–3790 of 4002 papers

Title | Status | Hype
Making Sense of Word Embeddings | Code | 0
FreSaDa: A French Satire Data Set for Cross-Domain Satire Detection | Code | 0
UMUTeam at SemEval-2021 Task 7: Detecting and Rating Humor and Offense with Linguistic Features and Word Embeddings | Code | 0
Man is to Computer Programmer as Woman is to Homemaker? Debiasing Word Embeddings | Code | 0
From Hyperbolic Geometry Back to Word Embeddings | Code | 0
Mapping distributional to model-theoretic semantic spaces: a baseline | Code | 0
From Incremental Meaning to Semantic Unit (phrase by phrase) | Code | 0
Mapping Supervised Bilingual Word Embeddings from English to low-resource languages | Code | 0
Profiling Bias in LLMs: Stereotype Dimensions in Contextual Word Embeddings | Code | 0
From Paraphrase Database to Compositional Paraphrase Model and Back | Code | 0

Leaderboards

No leaderboard results yet.