
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include predictive neural models such as Word2Vec, count-based methods such as GloVe, and other approaches that train on an NLP task such as language modeling or document classification.
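
As a concrete sketch of the idea described above, the snippet below trains skip-gram Word2Vec embeddings on a toy corpus and reads back a word's real-valued vector and its nearest neighbors. It uses the gensim library, which is an assumption on our part (the page names no implementation); it targets the gensim 4.x API, and the corpus and hyperparameters are purely illustrative.

```python
# Minimal sketch: learning word embeddings with Word2Vec via gensim
# (assumes gensim 4.x; the toy corpus and hyperparameters are illustrative).
from gensim.models import Word2Vec

# Toy corpus: each document is a list of tokens.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["king", "queen", "man", "woman"],
    ["glove", "and", "word2vec", "learn", "embeddings"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the real-valued vectors
    window=2,         # context window size
    min_count=1,      # keep every token in this tiny corpus
    sg=1,             # skip-gram objective (sg=0 would use CBOW)
)

vec = model.wv["embeddings"]           # a 50-dimensional numpy array
print(vec.shape)                       # (50,)
print(model.wv.most_similar("king"))   # nearest neighbors by cosine similarity
```

Setting sg=1 selects the skip-gram objective, which predicts context words from a center word; sg=0 would instead use CBOW, which predicts the center word from its averaged context.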

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 301–310 of 4,002 papers

Title | Status | Hype
Language Models Implement Simple Word2Vec-style Vector Arithmetic | Code | 1
Sentiment Analysis Using Aligned Word Embeddings for Uralic Languages | - | 0
Detecting and Mitigating Indirect Stereotypes in Word Embeddings | - | 0
Beyond Shared Vocabulary: Increasing Representational Word Similarities across Languages for Multilingual Machine Translation | Code | 0
MIANet: Aggregating Unbiased Instance and General Information for Few-Shot Semantic Segmentation | Code | 1
ChatGPT-EDSS: Empathetic Dialogue Speech Synthesis Trained from ChatGPT-derived Context Word Embeddings | - | 0
Probing Brain Context-Sensitivity with Masked-Attention Generation | - | 0
Word Embeddings Are Steers for Language Models | Code | 1
Measuring Intersectional Biases in Historical Documents | Code | 0
Controllable Speaking Styles Using a Large Language Model | - | 0

No leaderboard results yet.