
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
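Concretely, an embedding is just a lookup from a word to a dense real-valued vector, with word similarity measured in the vector space. The following minimal Python sketch illustrates this with a toy vocabulary, random vectors, and a cosine-similarity helper; all of these names and values are illustrative assumptions, not taken from any particular model:

# Minimal sketch of an embedding lookup table: each word in a toy
# vocabulary maps to a dense vector of real numbers. Vocabulary,
# dimensionality, and random initialization are illustrative only.
import numpy as np

vocab = ["king", "queen", "man", "woman"]   # toy vocabulary (assumed)
dim = 4                                     # embedding dimensionality (assumed)
rng = np.random.default_rng(0)
embeddings = {w: rng.normal(size=dim) for w in vocab}

# Word similarity is typically measured with cosine similarity
# between the corresponding vectors.
def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(embeddings["king"], embeddings["queen"]))

In a trained model the vectors are not random: they are learned so that words used in similar contexts end up close together in the space.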

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
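As a brief sketch of what training such a model looks like in practice, the Word2Vec implementation in the gensim library can be run on a tokenized corpus; the tiny corpus and hyperparameters below are illustrative placeholders, not a recommended configuration:

from gensim.models import Word2Vec

# A tiny illustrative corpus; real training uses a large tokenized text corpus.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
]

# vector_size, window, and sg=1 (skip-gram) are illustrative settings.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)

vec = model.wv["embeddings"]             # learned vector for a word
similar = model.wv.most_similar("word")  # nearest neighbors in embedding space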

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1401–1410 of 4002 papers

Title | Hype
Evaluating Metrics for Bias in Word Embeddings | 0
Evaluating Monolingual and Crosslingual Embeddings on Datasets of Word Association Norms | 0
Chemical Identification and Indexing in PubMed Articles via BERT and Text-to-Text Approaches | 0
Evaluating multi-sense embeddings for semantic resolution monolingually and in word translation | 0
Evaluating Natural Alpha Embeddings on Intrinsic and Extrinsic Tasks | 0
Chinese Embedding via Stroke and Glyph Information: A Dual-channel View | 0
A Framework for Decoding Event-Related Potentials from Text | 0
Evaluating Off-the-Shelf Machine Listening and Natural Language Models for Automated Audio Captioning | 0
Chinese Hypernym-Hyponym Extraction from User Generated Categories | 0
Detecting Policy Preferences and Dynamics in the UN General Debate with Neural Word Embeddings | 0

No leaderboard results yet.