
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
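
To make the definition concrete, here is a minimal sketch of the core operation in plain NumPy: each word in a (hypothetical, three-word) vocabulary indexes a row of a real-valued matrix, and the resulting vectors are compared by cosine similarity. The vocabulary, dimensionality, and random vectors are illustrative assumptions, not drawn from any trained model.

    import numpy as np

    # Hypothetical toy vocabulary; real models index tens of thousands of words.
    vocab = {"king": 0, "queen": 1, "apple": 2}

    # Embedding matrix: one real-valued row per word (4-dimensional here for
    # illustration; common sizes range from 50 to 300).
    rng = np.random.default_rng(0)
    E = rng.normal(size=(len(vocab), 4))

    def embed(word):
        """Map a word to its vector: a simple row lookup."""
        return E[vocab[word]]

    def cosine(u, v):
        """Cosine similarity, the usual way to compare embedding vectors."""
        return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

    print(cosine(embed("king"), embed("queen")))

In a trained model the rows of E are learned so that words appearing in similar contexts end up with high cosine similarity; with the random vectors above, the printed score is meaningless.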

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
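
As a rough illustration of the training route, the sketch below fits a skip-gram Word2Vec model on a toy corpus with gensim (assuming gensim 4.x; the corpus and hyperparameters are placeholders, not a recommended configuration).

    from gensim.models import Word2Vec

    # Toy tokenized corpus; in practice you would stream millions of sentences.
    sentences = [
        ["the", "king", "rules", "the", "kingdom"],
        ["the", "queen", "rules", "the", "kingdom"],
        ["i", "ate", "an", "apple"],
    ]

    # sg=1 selects skip-gram; vector_size is the embedding dimensionality.
    model = Word2Vec(sentences, vector_size=50, window=2,
                     min_count=1, sg=1, epochs=50)

    print(model.wv["king"])               # the learned 50-dim vector
    print(model.wv.most_similar("king"))  # nearest neighbours by cosine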

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 221–230 of 4002 papers

Title | Status | Hype
Understanding Linearity of Cross-Lingual Word Embedding Mappings | Code | 1
SA2SL: From Aspect-Based Sentiment Analysis to Social Listening System for Business Intelligence | Code | 1
Sentiment Word Aware Multimodal Refinement for Multimodal Sentiment Analysis with ASR Errors | Code | 1
Shortformer: Better Language Modeling using Shorter Inputs | Code | 1
SimAlign: High Quality Word Alignments without Parallel Training Data using Static and Contextualized Embeddings | Code | 1
Simple, Interpretable and Stable Method for Detecting Words with Usage Change across Corpora | Code | 1
Statistical Uncertainty in Word Embeddings: GloVe-V | Code | 1
Structured Pruning of Large Language Models | Code | 1
The Looming Threat of Fake and LLM-generated LinkedIn Profiles: Challenges and Opportunities for Detection and Prevention | Code | 1
Embed2Detect: Temporally Clustered Embedded Words for Event Detection in Social Media | Code | 1
