
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
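To make the mapping concrete, here is a minimal Python sketch of a lookup table from words to real-valued vectors, compared with cosine similarity. The vocabulary, dimensionality, and vector values are invented for illustration; real embeddings are typically 50-1000 dimensions and learned from large corpora.

```python
import numpy as np

# Toy lookup table: each vocabulary word maps to a dense real-valued vector.
# These words and 4-dimensional values are made up purely for illustration.
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1, 0.9]),
    "queen": np.array([0.7, 0.4, 0.1, 0.8]),
    "apple": np.array([0.1, 0.9, 0.7, 0.0]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; values near 1.0 mean similar."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # lower
```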

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an auxiliary NLP task such as language modeling or document classification.
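As a sketch of one such technique, the example below trains a small skip-gram Word2Vec model with the gensim library (parameter names assume gensim 4.x). The three-sentence corpus is invented for illustration and is far too small to yield meaningful embeddings.

```python
from gensim.models import Word2Vec

# A tiny tokenized corpus, invented for illustration; real training
# typically uses millions of sentences.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "apple", "fell", "from", "the", "tree"],
]

# Train a skip-gram model (sg=1; sg=0 would select CBOW instead).
model = Word2Vec(
    sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=100
)

vec = model.wv["king"]                 # the learned 50-dimensional vector
print(model.wv.most_similar("king"))  # nearest neighbors in embedding space
```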

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 231–240 of 4,002 papers

| Title | Status | Hype |
|---|---|---|
| OFA: A Framework of Initializing Unseen Subword Embeddings for Efficient Large-scale Multilingual Continued Pretraining | Code | 1 |
| Solving ARC visual analogies with neural embeddings and vector arithmetic: A generalized method | Code | 0 |
| Word Definitions from Large Language Models |  | 0 |
| How Abstract Is Linguistic Generalization in Large Language Models? Experiments with Argument Structure | Code | 0 |
| MatNexus: A Comprehensive Text Mining and Analysis Suite for Materials Discovery |  | 0 |
| Explainable Identification of Hate Speech towards Islam using Graph Neural Networks |  | 0 |
| An Embedded Diachronic Sense Change Model with a Case Study from Ancient Greek | Code | 0 |
| Evaluation Framework for Understanding Sensitive Attribute Association Bias in Latent Factor Recommendation Algorithms |  | 0 |
| ProMap: Effective Bilingual Lexicon Induction via Language Model Prompting | Code | 0 |
| Do Not Harm Protected Groups in Debiasing Language Representation Models |  | 0 |
