
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
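
To make the mapping concrete, here is a minimal sketch (not from any particular library) that stores one real-valued vector per vocabulary word as a row of a matrix and looks vectors up by index. The toy vocabulary, dimensionality, and random initialization are all illustrative assumptions; trained models learn these values instead.

```python
import numpy as np

# Hypothetical toy vocabulary; real models use tens of thousands of words.
vocab = ["king", "queen", "man", "woman"]
word_to_index = {word: i for i, word in enumerate(vocab)}

# Embedding matrix: one row of real numbers per word.
# Randomly initialized here; a trained model would learn these entries.
embedding_dim = 4
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), embedding_dim))

def embed(word: str) -> np.ndarray:
    """Map a word to its vector of real numbers."""
    return embeddings[word_to_index[word]]

print(embed("queen"))  # a 4-dimensional real-valued vector
```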

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification. A sketch of one such technique follows below.
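
As one concrete way to train such embeddings, the sketch below uses the gensim library's Word2Vec implementation. The toy corpus and hyperparameter values are illustrative assumptions, and the `vector_size`/`epochs` keywords assume gensim 4.x.

```python
from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences (illustrative only).
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
    ["similar", "words", "get", "similar", "vectors"],
]

# Train a skip-gram Word2Vec model (sg=1); CBOW would be sg=0.
model = Word2Vec(
    sentences,
    vector_size=50,   # dimensionality of the learned vectors
    window=3,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,
    epochs=100,
    seed=0,
)

vector = model.wv["embeddings"]                      # the learned 50-dim vector
neighbors = model.wv.most_similar("words", topn=3)   # nearest words by cosine similarity
print(vector.shape, neighbors)
```

In practice the corpus would be much larger, and `min_count` would be raised to discard rare words before training.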

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 161–170 of 4,002 papers

Title | Status | Hype
PejorativITy: Disambiguating Pejorative Epithets to Improve Misogyny Detection in Italian Tweets | Code | 0
Breaking the Silence: Detecting and Mitigating Gendered Abuse in Hindi, Tamil, and Indian English Online Spaces | Code | 0
DiLM: Distilling Dataset into Language Model for Text-level Dataset Distillation | Code | 1
The Shape of Word Embeddings: Quantifying Non-Isometry With Topological Data Analysis | Code | 0
Quantum Natural Language Processing | — | 0
Debiasing Sentence Embedders through Contrastive Word Pairs | Code | 0
Projective Methods for Mitigating Gender Bias in Pre-trained Language Models | Code | 0
Fusion approaches for emotion recognition from speech using acoustic and text-based features | — | 0
SemRoDe: Macro Adversarial Training to Learn Representations That are Robust to Word-Level Attacks | Code | 0
Introducing Syllable Tokenization for Low-resource Languages: A Case Study with Swahili | — | 0

No leaderboard results yet.