
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
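Concretely, the mapping can be pictured as a lookup into a matrix with one real-valued row per word. The sketch below is purely illustrative: the vocabulary, dimensionality, and random initialisation are assumptions, and real embeddings are learned from data rather than drawn at random.

```python
import numpy as np

# Toy vocabulary (illustrative); real systems build this from a corpus.
vocab = {"king": 0, "queen": 1, "man": 2, "woman": 3}

dim = 8  # embedding dimensionality (assumed; trained models often use 50-300)
rng = np.random.default_rng(0)
# One real-valued vector per vocabulary word.
embeddings = rng.normal(size=(len(vocab), dim))

def embed(word):
    """Map a word to its vector of real numbers."""
    return embeddings[vocab[word]]

print(embed("king").shape)  # (8,)
```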

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
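To make the training idea concrete, here is a minimal skip-gram model with negative sampling (the objective behind one Word2Vec variant) on a toy corpus. This is a sketch under simplifying assumptions: real Word2Vec adds frequent-word subsampling, unigram-distribution negative sampling, and learning-rate decay, none of which appear here.

```python
import numpy as np

# Toy corpus and vocabulary (illustrative only).
corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}

rng = np.random.default_rng(42)
dim, lr, window, negatives = 16, 0.05, 2, 3
W_in = rng.normal(scale=0.1, size=(len(vocab), dim))   # center-word vectors
W_out = rng.normal(scale=0.1, size=(len(vocab), dim))  # context-word vectors

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

for epoch in range(200):
    for pos, word in enumerate(corpus):
        c = idx[word]
        for off in range(-window, window + 1):
            if off == 0 or not 0 <= pos + off < len(corpus):
                continue
            # One observed (positive) context word plus sampled negatives.
            targets = [(idx[corpus[pos + off]], 1.0)]
            targets += [(int(rng.integers(len(vocab))), 0.0)
                        for _ in range(negatives)]
            for t, label in targets:
                score = sigmoid(W_in[c] @ W_out[t])
                g = score - label          # gradient of the logistic loss
                grad_in = g * W_out[t]
                grad_out = g * W_in[c]
                W_in[c] -= lr * grad_in
                W_out[t] -= lr * grad_out

# W_in now holds the learned embeddings, one row per vocabulary word.
```

Words that appear in similar contexts (here, e.g. "cat" and "dog") are pushed toward similar vectors, which is the property downstream NLP tasks exploit.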

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 176–200 of 4,002 papers

Title | Status | Hype
Zero-Shot Semantic Segmentation | Code | 1
All Word Embeddings from One Embedding | Code | 1
Improving Bilingual Lexicon Induction with Cross-Encoder Reranking | Code | 1
Improving word mover's distance by leveraging self-attention matrix | Code | 1
Improving Word Translation via Two-Stage Contrastive Learning | Code | 1
Imputing Out-of-Vocabulary Embeddings with LOVE Makes Language Models Robust with Little Cost | Code | 1
iNLTK: Natural Language Toolkit for Indic Languages | Code | 1
In Other News: A Bi-style Text-to-speech Model for Synthesizing Newscaster Voice with Limited Data | Code | 1
IRB-NLP at SemEval-2022 Task 1: Exploring the Relationship Between Words and Their Semantic Representations | Code | 1
Is Neural Topic Modelling Better than Clustering? An Empirical Study on Clustering with Contextual Embeddings for Topics | Code | 1
Keyword-Guided Neural Conversational Model | Code | 1
Corrected CBOW Performs as well as Skip-gram | Code | 1
Language Modelling Makes Sense: Propagating Representations through WordNet for Full-Coverage Word Sense Disambiguation | Code | 1
Language Models Implement Simple Word2Vec-style Vector Arithmetic | Code | 1
Leveraging MLLM Embeddings and Attribute Smoothing for Compositional Zero-Shot Learning | Code | 1
LingJing at SemEval-2022 Task 1: Multi-task Self-supervised Pre-training for Multilingual Reverse Dictionary | Code | 1
GrEmLIn: A Repository of Green Baseline Embeddings for 87 Low-Resource Languages Injected with Multilingual Graph Knowledge | Code | 1
Machine learning as a model for cultural learning: Teaching an algorithm what it means to be fat | Code | 1
MIANet: Aggregating Unbiased Instance and General Information for Few-Shot Semantic Segmentation | Code | 1
MirrorWiC: On Eliciting Word-in-Context Representations from Pretrained Language Models | Code | 1
Modality-Transferable Emotion Embeddings for Low-Resource Multimodal Emotion Recognition | Code | 1
MorphTE: Injecting Morphology in Tensorized Embeddings | Code | 1
Multilingual Jointly Trained Acoustic and Written Word Embeddings | Code | 1
Contextualized Embeddings based Transformer Encoder for Sentence Similarity Modeling in Answer Selection Task | Code | 1
Page 8 of 161

No leaderboard results yet.