
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
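Concretely, an embedding is a lookup from each word to a dense vector, and semantic similarity between words is measured geometrically, most often with cosine similarity. The following minimal Python sketch illustrates the idea; the words and vector values are invented for illustration.

```python
import numpy as np

# Toy embedding table: each word maps to a dense real-valued vector.
# The 4-dimensional values here are invented for illustration;
# learned embeddings typically have 50-300+ dimensions.
embeddings = {
    "king":  np.array([0.80, 0.10, 0.70, 0.30]),
    "queen": np.array([0.75, 0.20, 0.80, 0.30]),
    "apple": np.array([0.10, 0.90, 0.10, 0.60]),
}

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine of the angle between two vectors; higher means more similar."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high (~0.99)
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # lower (~0.35)
```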

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches, many of them neural, that train on an NLP task such as language modeling or document classification.
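As a sketch of how such embeddings are learned in practice, the snippet below fits a small skip-gram Word2Vec model with the gensim library (gensim 4.x is assumed to be installed); the three-sentence corpus is invented for illustration, and a real model would be trained on a much larger tokenized corpus.

```python
from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences (invented for illustration).
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "raw", "text"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
]

# Train a small skip-gram Word2Vec model.
model = Word2Vec(
    sentences,
    vector_size=50,  # dimensionality of the learned vectors
    window=3,        # context window size
    min_count=1,     # keep every word in this tiny corpus
    sg=1,            # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

vec = model.wv["embeddings"]                       # 50-dim vector for a word
print(model.wv.most_similar("word2vec", topn=3))   # nearest neighbours
```

Skip-gram (sg=1) predicts surrounding context words from the center word, which tends to work better on small corpora; CBOW (sg=0) predicts the center word from its context and is faster on large ones.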

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 991–1000 of 4002 papers

Title | Status | Hype
Object Priors for Classifying and Localizing Unseen Actions | Code | 0
FreSaDa: A French Satire Data Set for Cross-Domain Satire Detection | Code | 0
Machine Learning Based on Natural Language Processing to Detect Cardiac Failure in Clinical Narratives | - | 0
Probing BERT in Hyperbolic Spaces | Code | 1
Statistically significant detection of semantic shifts using contextual word embeddings | - | 0
Revisiting Simple Neural Probabilistic Language Models | Code | 1
Combining Pre-trained Word Embeddings and Linguistic Features for Sequential Metaphor Identification | - | 0
VERB: Visualizing and Interpreting Bias Mitigation Techniques for Word Representations | Code | 1
Query2Prod2Vec Grounded Word Embeddings for eCommerce | Code | 1
Mining Trends of COVID-19 Vaccine Beliefs on Twitter with Lexical Embeddings | - | 0

Leaderboard

No leaderboard results yet.