
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
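
At its simplest, an embedding is a lookup table from words to dense real-valued vectors, and semantic similarity between words corresponds to geometric closeness between their vectors. A minimal sketch in Python (the words, dimensions, and vector values below are invented for illustration; real embeddings are learned from data):

```python
import numpy as np

# Hypothetical 4-dimensional embeddings. Real models typically use
# 50-1024 dimensions, with values learned from a corpus, not hand-set.
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10, 0.20]),
    "queen": np.array([0.78, 0.70, 0.12, 0.25]),
    "apple": np.array([0.05, 0.10, 0.90, 0.30]),
}

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine of the angle between two vectors; 1.0 means same direction."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Related words end up with higher similarity than unrelated ones.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # ~0.998
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # ~0.25
```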

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
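
As a concrete example of the training side, the sketch below fits a skip-gram Word2Vec model with the gensim library (the gensim 4.x API is assumed; the toy corpus is far too small to learn meaningful vectors and is only there to make the snippet runnable):

```python
from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences. A real corpus would contain
# millions of sentences.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "local", "context"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "statistics"],
]

# sg=1 selects the skip-gram objective; vector_size is the embedding
# dimension (gensim >= 4.0 renamed the older `size` parameter).
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)

vector = model.wv["embeddings"]                           # learned 50-d vector
neighbors = model.wv.most_similar("embeddings", topn=3)   # nearest words
print(vector.shape, neighbors)
```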

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 401–410 of 4002 papers

Title | Status | Hype
Debiasing Sentence Embedders through Contrastive Word Pairs | Code | 0
Projective Methods for Mitigating Gender Bias in Pre-trained Language Models | Code | 0
Introducing Syllable Tokenization for Low-resource Languages: A Case Study with Swahili | | 0
Advancing Fake News Detection: Hybrid Deep Learning with FastText and Explainable AI | | 0
A comparative analysis of embedding models for patent similarity | | 0
An efficient domain-independent approach for supervised keyphrase extraction and ranking | | 0
Empowering Segmentation Ability to Multi-modal Large Language Models | Code | 0
Leveraging Linguistically Enhanced Embeddings for Open Information Extraction | | 0
Improving Acoustic Word Embeddings through Correspondence Training of Self-supervised Speech Representations | Code | 0
Identifying and interpreting non-aligned human conceptual representations using language modeling | | 0
Page 41 of 401

Leaderboard

No leaderboard results yet.