Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
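To make the definition concrete, here is a minimal sketch in Python (using NumPy) of a tiny vocabulary mapped to real-valued vectors, with cosine similarity as the usual measure of closeness. The words, dimensions, and vector values are made up purely for illustration.

```python
import numpy as np

# Hypothetical 4-dimensional embeddings for a toy vocabulary
# (the numbers are invented for illustration, not learned).
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1, 0.6]),
    "queen": np.array([0.7, 0.4, 0.1, 0.7]),
    "apple": np.array([0.1, 0.9, 0.8, 0.0]),
}

def cosine(u, v):
    """Cosine similarity: near 1.0 for similar directions, near 0.0 for unrelated ones."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should end up with more similar vectors.
print(cosine(embeddings["king"], embeddings["queen"]))  # relatively high
print(cosine(embeddings["king"], embeddings["apple"]))  # lower
```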

Techniques for learning word embeddings include Word2Vec, GloVe, and neural network approaches that learn embeddings as a by-product of training on an NLP task such as language modeling or document classification.
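As an illustration of one such technique, the sketch below trains a small skip-gram Word2Vec model with the gensim library on a toy corpus. The corpus, embedding dimension, and hyperparameters are arbitrary placeholders, not a recommended setup.

```python
from gensim.models import Word2Vec

# Toy corpus: each sentence is a pre-tokenized list of words.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["embeddings", "map", "words", "to", "vectors"],
]

# Train a skip-gram model (sg=1); vector_size is the embedding dimension.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

# Look up the learned vector for a word and inspect its nearest neighbours.
vec = model.wv["king"]                       # numpy array of shape (50,)
print(model.wv.most_similar("king", topn=3))
```

On a corpus this small the neighbours are noisy; the point is only the API shape: tokenized sentences in, a word-to-vector table out.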

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 141–150 of 4,002 papers

Title | Status | Hype
E2Vec: Feature Embedding with Temporal Information for Analyzing Student Actions in E-Book Systems | Code | 0
Talk to Parallel LiDARs: A Human-LiDAR Interaction Method Based on 3D Visual Grounding | - | 0
A Novel Cartography-Based Curriculum Learning Method Applied on RoNLI: The First Romanian Natural Language Inference Corpus | Code | 0
Exploring Public Attention in the Circular Economy through Topic Modelling with Twin Hyperparameter Optimisation | Code | 0
"Hunt Takes Hare": Theming Games Through Game-Word Vector Translation | - | 0
LGDE: Local Graph-based Dictionary Expansion | Code | 0
A Comprehensive Analysis of Static Word Embeddings for Turkish | Code | 1
AnomalyLLM: Few-shot Anomaly Edge Detection for Dynamic Graphs using Large Language Models | Code | 1
Span-Aggregatable, Contextualized Word Embeddings for Effective Phrase Mining | - | 0
Word-specific tonal realizations in Mandarin | - | 0
Page 15 of 401

Leaderboard

No leaderboard results yet.