
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that are trained on an NLP task such as language modeling or document classification.

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)
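As a concrete illustration of how words are mapped to vectors of real numbers, the sketch below trains a small skip-gram Word2Vec model with the gensim library. The toy corpus, the hyperparameter values, and the choice of gensim are illustrative assumptions, not part of the task definition above.

```python
from gensim.models import Word2Vec

# Tiny illustrative corpus: a list of tokenized sentences (assumption, not real data).
sentences = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "chased", "the", "cat"],
    ["the", "dog", "sat", "on", "the", "rug"],
]

# Train a skip-gram Word2Vec model; hyperparameters here are arbitrary examples.
model = Word2Vec(
    sentences,
    vector_size=100,  # dimensionality of the embedding vectors
    window=5,         # context window size
    min_count=1,      # keep every word, since the toy corpus is tiny
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=20,
)

# Each vocabulary word is now mapped to a 100-dimensional real-valued vector.
cat_vector = model.wv["cat"]
print(cat_vector.shape)

# Nearest neighbours in the embedding space, by cosine similarity.
print(model.wv.most_similar("cat", topn=3))
```

On a realistic corpus, semantically related words (e.g. "cat" and "dog") end up with nearby vectors, which is what makes these embeddings useful as input features for downstream NLP models.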

Papers

Showing 221–230 of 4,002 papers

Title | Status | Hype
Where exactly does contextualization in a PLM happen? | - | 0
Constructing Vec-tionaries to Extract Message Features from Texts: A Case Study of Moral Appeals | - | 0
Unsupervised Approach to Evaluate Sentence-Level Fluency: Do We Really Need Reference? | Code | 0
Robust Concept Erasure via Kernelized Rate-Distortion Maximization | Code | 0
Quantifying the redundancy between prosody and text | Code | 1
Lego: Learning to Disentangle and Invert Personalized Concepts Beyond Object Appearance in Text-to-Image Diffusion Models | - | 0
Multilingual Word Embeddings for Low-Resource Languages using Anchors and a Chain of Related Languages | - | 0
Bit Cipher -- A Simple yet Powerful Word Representation System that Integrates Efficiently with Language Models | - | 0
Compositional Fusion of Signals in Data Embedding | - | 0
Spoken Word2Vec: Learning Skipgram Embeddings from Speech | Code | 0
Page 23 of 401

No leaderboard results yet.