Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
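As a minimal illustration of the idea, the sketch below maps a toy vocabulary to real-valued vectors and compares them with cosine similarity. The vocabulary, the 4 dimensions, and the vector values are invented for illustration; learned embeddings typically have hundreds of dimensions.

```python
import numpy as np

# Hypothetical toy vocabulary with hand-picked 4-dimensional vectors;
# real embeddings are learned from data, not written by hand.
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1, 0.9]),
    "queen": np.array([0.7, 0.4, 0.1, 0.8]),
    "apple": np.array([0.1, 0.9, 0.7, 0.2]),
}

def cosine_similarity(u, v):
    # Compares direction only, ignoring vector magnitude.
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should score higher than unrelated ones.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```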

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches, many of them neural-network-based, that train on an NLP task such as language modeling or document classification.
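As one concrete example, the sketch below trains a small Word2Vec model with the gensim library (assuming gensim 4.x). The three-sentence corpus is invented for illustration; a useful model would be trained on a far larger body of text.

```python
from gensim.models import Word2Vec

# Tiny invented corpus: each document is a list of tokens.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "raw", "text"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
]

# sg=1 selects the skip-gram objective; vector_size is the embedding dimension.
model = Word2Vec(sentences=corpus, vector_size=50, window=2,
                 min_count=1, sg=1, epochs=200, seed=42)

vec = model.wv["embeddings"]   # the learned 50-dimensional vector
print(vec.shape)               # (50,)
print(model.wv.most_similar("embeddings", topn=3))
```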

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 461–470 of 4,002 papers

Title | Status | Hype
MatNexus: A Comprehensive Text Mining and Analysis Suite for Materials Discovery | - | 0
Explainable Identification of Hate Speech towards Islam using Graph Neural Networks | - | 0
An Embedded Diachronic Sense Change Model with a Case Study from Ancient Greek | Code | 0
Evaluation Framework for Understanding Sensitive Attribute Association Bias in Latent Factor Recommendation Algorithms | - | 0
ProMap: Effective Bilingual Lexicon Induction via Language Model Prompting | Code | 0
Do Not Harm Protected Groups in Debiasing Language Representation Models | - | 0
Analogical Proportions and Creativity: A Preliminary Study | - | 0
GARI: Graph Attention for Relative Isomorphism of Arabic Word Embeddings | Code | 0
ChatGPT-guided Semantics for Zero-shot Learning | Code | 0
Enhancing Interpretability using Human Similarity Judgements to Prune Word Embeddings | - | 0