
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural approaches that train on an NLP task such as language modeling or document classification.

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)
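As a concrete sketch of the word-to-vector mapping described above, the snippet below trains a tiny skip-gram Word2Vec model with the gensim library (assuming the gensim 4.x API; the toy corpus and all parameter values are illustrative only, not a training recipe):

```python
from gensim.models import Word2Vec

# Toy corpus: each document is a pre-tokenized list of words.
# A real model would be trained on a far larger corpus.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["king", "queen", "man", "woman"],
    ["vectors", "capture", "semantic", "similarity"],
]

# Train a small skip-gram model (sg=1); vector_size is the
# dimensionality of the learned embedding space.
model = Word2Vec(corpus, vector_size=50, window=2,
                 min_count=1, sg=1, epochs=100)

# Each vocabulary word is now mapped to a dense real-valued vector.
vec = model.wv["embeddings"]   # numpy array of shape (50,)
print(vec[:5])

# Nearest neighbours by cosine similarity in the embedding space.
print(model.wv.most_similar("vectors", topn=3))
```

The same lookup-and-neighbour pattern applies to embeddings from any of the techniques listed above; only the training objective that produces the vectors differs.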

Papers

Showing 11–20 of 4002 papers

| Title | Status | Hype |
|---|---|---|
| Bridging the Modality Gap: Enhancing Channel Prediction with Semantically Aligned LLMs and Knowledge Distillation | | 0 |
| A Comparative Analysis of Static Word Embeddings for Hungarian | Code | 0 |
| An Exploratory Analysis on the Explanatory Potential of Embedding-Based Measures of Semantic Transparency for Malay Word Recognition | | 0 |
| Towards Smart Point-and-Shoot Photography | | 0 |
| Word Embedding Techniques for Classification of Star Ratings | | 0 |
| Word Embeddings Track Social Group Changes Across 70 Years in China | | 0 |
| Geological Inference from Textual Data using Word Embeddings | Code | 0 |
| LayerFlow: Layer-wise Exploration of LLM Embeddings using Uncertainty-aware Interlinked Projections | | 0 |
| myNER: Contextualized Burmese Named Entity Recognition with Bidirectional LSTM and fastText Embeddings via Joint Training with POS Tagging | Code | 0 |
| Investigating and Mitigating Stereotype-aware Unfairness in LLM-based Recommendations | | 0 |
Page 2 of 401

No leaderboard results yet.