
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
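The idea is easy to see in code. The sketch below is a minimal illustration, assuming the gensim library and a toy corpus (both my own choices, not referenced on this page): it trains a small Word2Vec skip-gram model and queries the learned vectors.

```python
# Minimal Word2Vec sketch using gensim (pip install gensim).
# The corpus and hyperparameters are illustrative assumptions only;
# real embeddings are trained on corpora of millions of tokens.
from gensim.models import Word2Vec

# Tiny tokenized corpus: one list of tokens per sentence.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["king", "queen", "man", "woman"],
    ["paris", "france", "berlin", "germany"],
]

# vector_size: dimensionality of the embedding space
# window: context window; sg=1 selects the skip-gram objective
# min_count=1 keeps even the rare tokens in this toy corpus
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)

vector = model.wv["king"]          # a 50-dimensional numpy array
print(vector.shape)                # (50,)
print(model.wv.most_similar("king", topn=3))
```

Once trained, the vectors serve as reusable features: words that appear in similar contexts end up close together in the embedding space, which is what the similarity query above measures (by cosine similarity).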

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1611–1620 of 4002 papers

Each paper below is listed with an empty Status column and a Hype score of 0.

From Prejudice to Parity: A New Approach to Debiasing Large Language Model Word Embeddings
From Raw Text to Universal Dependencies - Look, No Tags!
Comparison between Voting Classifier and Deep Learning methods for Arabic Dialect Identification
Assessing multiple word embeddings for named entity recognition of professions and occupations in health-related social media
Comparison of Paragram and GloVe Results for Similarity Benchmarks
Des représentations continues de mots pour l'analyse d'opinions en arabe : une étude qualitative (Word embeddings for Arabic sentiment analysis: a qualitative study)
From Word Vectors to Multimodal Embeddings: Techniques, Applications, and Future Directions For Large Language Models
From Zero to Hero: On the Limitations of Zero-Shot Cross-Lingual Transfer with Multilingual Transformers
From Zero to Hero: On the Limitations of Zero-Shot Language Transfer with Multilingual Transformers
An evaluation of Czech word embeddings

No leaderboard results yet.