
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
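Concretely, an embedding model amounts to a lookup table: a matrix with one real-valued row per vocabulary word. The sketch below illustrates this in Python with NumPy; the four-word vocabulary and the random (untrained) vectors are placeholders for illustration, not output of any method listed on this page.

```python
import numpy as np

# A hypothetical toy vocabulary; any tokenized corpus would define this mapping.
vocab = {"king": 0, "queen": 1, "man": 2, "woman": 3}

# The embedding table: one real-valued vector per word (here random and
# 8-dimensional; trained embeddings would carry semantic structure).
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), 8))

def embed(word: str) -> np.ndarray:
    """Map a word to its vector by table lookup."""
    return embeddings[vocab[word]]

print(embed("queen"))  # an 8-dimensional real vector
```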

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
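As a concrete example of one such technique, the sketch below trains a skip-gram Word2Vec model with the gensim library (assuming gensim ≥ 4.0). The toy corpus and hyperparameter values are illustrative assumptions, not settings from any paper listed here.

```python
from gensim.models import Word2Vec

# A hypothetical toy corpus; real training uses millions of tokenized sentences.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "man", "walks", "in", "the", "city"],
    ["the", "woman", "walks", "in", "the", "city"],
]

# Skip-gram Word2Vec (sg=1): learn vectors by predicting context words
# from each center word.
model = Word2Vec(
    sentences,
    vector_size=50,  # embedding dimensionality
    window=2,        # context window size
    min_count=1,     # keep every word in this tiny corpus
    sg=1,            # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

vector = model.wv["queen"]                    # the learned 50-d vector
print(model.wv.most_similar("king", topn=2))  # nearest neighbors by cosine similarity
```

On a corpus this small the neighbors are not meaningful; the point is only the API shape: sentences in, a word-to-vector table out.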

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 11–20 of 4,002 papers (page 2 of 401)

Title | Status | Hype
ConceptNet at SemEval-2017 Task 2: Extending Word Embeddings with Multilingual Relational Knowledge | Code | 2
ConceptNet 5.5: An Open Multilingual Graph of General Knowledge | Code | 2
An Ensemble Method to Produce High-Quality Word Embeddings (2016) | Code | 2
Leveraging MLLM Embeddings and Attribute Smoothing for Compositional Zero-Shot Learning | Code | 1
HJ-Ky-0.1: an Evaluation Dataset for Kyrgyz Word Embeddings | Code | 1
Fine-Tuning CLIP's Last Visual Projector: A Few-Shot Cornucopia | Code | 1
Enhancing High-order Interaction Awareness in LLM-based Recommender Model | Code | 1
GrEmLIn: A Repository of Green Baseline Embeddings for 87 Low-Resource Languages Injected with Multilingual Graph Knowledge | Code | 1
DiffEditor: Enhancing Speech Editing with Semantic Enrichment and Acoustic Consistency | Code | 1
Spiking Convolutional Neural Networks for Text Classification | Code | 1

No leaderboard results yet.