
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches trained on an NLP task such as language modeling or document classification.
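The core idea above — each word maps to a vector of real numbers, and related words end up near each other — can be illustrated with a toy sketch. The 3-dimensional vectors below are made up for the example (real Word2Vec or GloVe embeddings are learned from data and typically 50–300 dimensional); only the cosine-similarity comparison is the standard technique.

```python
import math

# Toy vocabulary: each word maps to a dense real-valued vector.
# These 3-d vectors are invented for illustration, not learned.
embeddings = {
    "king":  [0.8, 0.6, 0.1],
    "queen": [0.7, 0.7, 0.1],
    "apple": [0.1, 0.2, 0.9],
}

def cosine(u, v):
    """Cosine similarity, the usual way to compare embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(a * a for a in v))
    return dot / (norm_u * norm_v)

# Semantically related words should be closer in the vector space:
print(cosine(embeddings["king"], embeddings["queen"]))  # high similarity
print(cosine(embeddings["king"], embeddings["apple"]))  # lower similarity
```

In a trained model the same lookup-and-compare pattern applies, but the table is produced by optimizing a prediction task (e.g. predicting context words) rather than by hand.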

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 261–270 of 4,002 papers

| Title | Status | Hype |
| --- | --- | --- |
| An Exploratory Analysis on the Explanatory Potential of Embedding-Based Measures of Semantic Transparency for Malay Word Recognition | — | 0 |
| Towards Smart Point-and-Shoot Photography | — | 0 |
| Word Embedding Techniques for Classification of Star Ratings | — | 0 |
| Word Embeddings Track Social Group Changes Across 70 Years in China | — | 0 |
| Geological Inference from Textual Data using Word Embeddings | Code | 0 |
| LayerFlow: Layer-wise Exploration of LLM Embeddings using Uncertainty-aware Interlinked Projections | — | 0 |
| Investigating and Mitigating Stereotype-aware Unfairness in LLM-based Recommendations | — | 0 |
| myNER: Contextualized Burmese Named Entity Recognition with Bidirectional LSTM and fastText Embeddings via Joint Training with POS Tagging | Code | 0 |
| Words as Bridges: Exploring Computational Support for Cross-Disciplinary Translation Work | Code | 0 |
| Poisson-Process Topic Model for Integrating Knowledge from Pre-trained Language Models | — | 0 |
Page 27 of 401

No leaderboard results yet.