
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
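Concretely, an embedding is a lookup into a learned matrix whose rows are real-valued vectors, one per vocabulary entry. The sketch below is purely illustrative: the toy vocabulary, dimension, and random matrix are placeholders standing in for a trained model.

```python
import numpy as np

# Hypothetical toy vocabulary and embedding dimension, for illustration only.
vocab = {"king": 0, "queen": 1, "man": 2, "woman": 3}
dim = 4

# In practice these vectors are learned during training;
# here they are random stand-ins.
rng = np.random.default_rng(0)
embedding_matrix = rng.normal(size=(len(vocab), dim))

def embed(word: str) -> np.ndarray:
    """Map a word to its real-valued vector via a table lookup."""
    return embedding_matrix[vocab[word]]

print(embed("queen"))  # a length-4 vector of real numbers
```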

Techniques for learning word embeddings include predictive neural models such as Word2Vec, count-based methods such as GloVe, and representations learned as a by-product of training neural networks on an NLP task such as language modeling or document classification.
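As a minimal sketch of how such a model is trained in practice, the example below uses the gensim implementation of Word2Vec; the toy corpus and hyperparameter values are assumptions for illustration, not recommendations.

```python
from gensim.models import Word2Vec

# Tiny illustrative corpus: a list of tokenized sentences.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "man", "walks"],
    ["the", "woman", "walks"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,  # dimensionality of the learned vectors
    window=2,        # context window size
    min_count=1,     # keep every word in this tiny corpus
    sg=1,            # 1 = skip-gram, 0 = CBOW
)

vector = model.wv["queen"]                    # the learned 50-d vector
print(model.wv.most_similar("king", topn=2))  # nearest neighbors by cosine similarity
```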

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing papers 191-200 of 4,002 (page 20 of 401)

Title | Status | Hype
Leveraging MLLM Embeddings and Attribute Smoothing for Compositional Zero-Shot Learning | Code | 1
LingJing at SemEval-2022 Task 1: Multi-task Self-supervised Pre-training for Multilingual Reverse Dictionary | Code | 1
GrEmLIn: A Repository of Green Baseline Embeddings for 87 Low-Resource Languages Injected with Multilingual Graph Knowledge | Code | 1
Machine learning as a model for cultural learning: Teaching an algorithm what it means to be fat | Code | 1
MIANet: Aggregating Unbiased Instance and General Information for Few-Shot Semantic Segmentation | Code | 1
MirrorWiC: On Eliciting Word-in-Context Representations from Pretrained Language Models | Code | 1
Modality-Transferable Emotion Embeddings for Low-Resource Multimodal Emotion Recognition | Code | 1
MorphTE: Injecting Morphology in Tensorized Embeddings | Code | 1
Multilingual Jointly Trained Acoustic and Written Word Embeddings | Code | 1
Contextualized Embeddings based Transformer Encoder for Sentence Similarity Modeling in Answer Selection Task | Code | 1
