
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches, typically neural-network-based, that train on an auxiliary NLP task such as language modeling or document classification.
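
To make the mapping concrete, here is a minimal sketch that trains a skip-gram Word2Vec model on a toy corpus and queries the resulting vectors. It assumes the gensim library (4.x API); the corpus, parameter values, and variable names are illustrative choices, not drawn from any paper listed below.

    # Minimal Word2Vec sketch (assumes gensim 4.x; toy corpus is illustrative).
    from gensim.models import Word2Vec

    # Each sentence is a list of tokens; a real corpus would be far larger.
    corpus = [
        ["the", "king", "rules", "the", "kingdom"],
        ["the", "queen", "rules", "the", "kingdom"],
        ["the", "dog", "chases", "the", "cat"],
        ["the", "cat", "chases", "the", "mouse"],
    ]

    model = Word2Vec(
        sentences=corpus,
        vector_size=50,   # dimensionality of the embedding vectors
        window=2,         # context words considered on each side of the target
        min_count=1,      # keep every token (the toy corpus is tiny)
        sg=1,             # 1 = skip-gram, 0 = CBOW
        epochs=100,
        seed=42,
    )

    # Every vocabulary word is now a 50-dimensional vector of real numbers.
    vec = model.wv["king"]
    print(vec.shape)  # (50,)

    # Cosine similarity in the embedding space approximates semantic relatedness.
    print(model.wv.most_similar("king", topn=3))

In practice the corpus would be much larger and min_count would be raised to discard rare tokens. GloVe differs in that it fits vectors to global co-occurrence statistics rather than to sliding-window predictions, but it produces the same kind of real-valued vector per word.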

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 391-400 of 4,002 papers

Title | Status | Hype
Weakly Supervised Deep Hyperspherical Quantization for Image Retrieval | Code | 0
IITK at SemEval-2024 Task 1: Contrastive Learning and Autoencoders for Semantic Textual Relatedness in Multilingual Texts | Code | 0
BanglaAutoKG: Automatic Bangla Knowledge Graph Construction with Semantic Neural Graph Filtering | Code | 0
Robust Concept Erasure Using Task Vectors | - | 0
PejorativITy: Disambiguating Pejorative Epithets to Improve Misogyny Detection in Italian Tweets | Code | 0
Breaking the Silence: Detecting and Mitigating Gendered Abuse in Hindi, Tamil, and Indian English Online Spaces | Code | 0
The Shape of Word Embeddings: Quantifying Non-Isometry With Topological Data Analysis | Code | 0
Quantum Natural Language Processing | - | 0
SemRoDe: Macro Adversarial Training to Learn Representations That are Robust to Word-Level Attacks | Code | 0
Fusion approaches for emotion recognition from speech using acoustic and text-based features | - | 0

No leaderboard results yet.