
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) in which words or phrases from the vocabulary are mapped to vectors of real numbers.
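As a minimal sketch of this idea (the four-dimensional toy vectors below are hypothetical, not learned embeddings; real models use vectors of hundreds of dimensions fitted to a corpus):

    import numpy as np

    # Hypothetical toy embeddings: each vocabulary item maps to a
    # real-valued vector. Learned embeddings are typically 50-1000
    # dimensions, trained rather than hand-picked.
    embeddings = {
        "king":  np.array([0.8, 0.1, 0.6, 0.2]),
        "queen": np.array([0.7, 0.2, 0.6, 0.3]),
        "apple": np.array([0.1, 0.9, 0.0, 0.5]),
    }

    def cosine_similarity(u, v):
        # Cosine of the angle between two vectors; 1.0 means same direction.
        return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

    # Semantically related words should sit closer together in the space.
    print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # higher
    print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # lower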

Techniques for learning word embeddings include predictive neural models such as Word2Vec, count-based methods such as GloVe, and other approaches that train word vectors on an NLP task such as language modeling or document classification.
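As a concrete illustration, the sketch below trains a tiny Word2Vec model with the open-source gensim library (assuming gensim 4.x, where the dimensionality argument is named vector_size). The three-sentence corpus is an invented toy example, far too small to yield meaningful vectors:

    from gensim.models import Word2Vec

    # Toy corpus (hypothetical): each sentence is a list of tokens.
    # A real training corpus would contain millions of sentences.
    sentences = [
        ["the", "king", "rules", "the", "kingdom"],
        ["the", "queen", "rules", "the", "kingdom"],
        ["the", "farmer", "grows", "apples"],
    ]

    # sg=1 selects the skip-gram objective (predict context words from
    # the center word); sg=0 would train CBOW instead.
    model = Word2Vec(
        sentences,
        vector_size=50,  # dimensionality of the learned vectors
        window=2,        # context window on each side of the center word
        min_count=1,     # keep every token (sensible only for toy data)
        sg=1,
        epochs=100,
    )

    vector = model.wv["king"]                     # 50-dimensional numpy array
    print(model.wv.most_similar("king", topn=2))  # nearest neighbors by cosine

GloVe, by contrast, fits word vectors to a global co-occurrence matrix rather than sliding a predictive window over the text, but both methods produce the same kind of dense real-valued vectors.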

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1771–1780 of 4002 papers (page 178 of 401)

Title | Hype
BERTrade: Using Contextual Embeddings to Parse Old French | 0
Hypothesis Testing based Intrinsic Evaluation of Word Embeddings | 0
I2DFormer: Learning Image to Document Attention for Zero-Shot Image Classification | 0
ICL-HD at SemEval-2016 Task 10: Improving the Detection of Minimal Semantic Units and their Meanings with an Ontology and Word Embeddings | 0
Continuous Word Embedding Fusion via Spectral Decomposition | 0
A Distribution-based Model to Learn Bilingual Word Embeddings | 0
A Transparent Framework for Evaluating Unintended Demographic Bias in Word Embeddings | 0
Identification of Biased Terms in News Articles by Comparison of Outlet-specific Word Embeddings | 0
Identification of Indigenous Knowledge Concepts through Semantic Networks, Spelling Tools and Word Embeddings | 0
BERTMap: A BERT-based Ontology Alignment System | 0

No leaderboard results yet.