
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and neural network approaches that train on an NLP task such as language modeling or document classification.
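
To make the idea concrete, here is a minimal sketch of learning skip-gram Word2Vec embeddings with the gensim library; the toy corpus, the hyperparameters, and the gensim >= 4.0 API are assumptions for illustration, not part of this page.

    # Minimal Word2Vec sketch (assumes gensim >= 4.0 is installed).
    # The toy corpus and hyperparameters below are illustrative only.
    from gensim.models import Word2Vec

    corpus = [
        ["the", "king", "rules", "the", "kingdom"],
        ["the", "queen", "rules", "the", "kingdom"],
        ["the", "dog", "chases", "the", "ball"],
        ["the", "cat", "chases", "the", "mouse"],
    ]

    # sg=1 selects the skip-gram objective; vector_size is the embedding dimension.
    model = Word2Vec(corpus, vector_size=50, window=2, min_count=1,
                     sg=1, epochs=200, seed=1)

    vec = model.wv["king"]                        # a 50-dimensional real-valued vector
    print(vec.shape)                              # (50,)
    print(model.wv.most_similar("king", topn=3))  # nearest words by cosine similarity

GloVe reaches a similar result by a different route: rather than training a predictive network, it fits word vectors to global word co-occurrence counts gathered from the corpus.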

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2321–2330 of 4002 papers

Supervised Word Sense Disambiguation with Sentences Similarities from Context Word Embeddings
Survey on Automated Short Answer Grading with Deep Learning: from Word Embeddings to Transformers
SVIP: Semantically Contextualized Visual Patches for Zero-Shot Learning
SwissCheese at SemEval-2016 Task 4: Sentiment Classification Using an Ensemble of Convolutional Neural Networks with Distant Supervision
SWOW-8500: Word Association task for Intrinsic Evaluation of Word Embeddings
Symmetric Pattern Based Word Embeddings for Improved Word Similarity Prediction
Symmetric Patterns and Coordinations: Fast and Enhanced Representations of Verbs and Adjectives
Synergizing Unsupervised and Supervised Learning: A Hybrid Approach for Accurate Natural Language Task Modeling
Synonym Discovery with Etymology-based Word Embeddings
Syntactic and semantic classification of verb arguments using dependency-based and rich semantic features

No leaderboard results yet.