
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
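As an illustration, the sketch below shows the core idea: an embedding matrix that maps each vocabulary word to a vector of real numbers. The vocabulary, dimension, and random vectors are toy stand-ins, not a trained model.

```python
import numpy as np

# Toy vocabulary and embedding dimension (illustrative values only).
vocab = ["king", "queen", "man", "woman"]
dim = 4

# An embedding matrix: one real-valued vector per vocabulary word.
# In practice these vectors are learned from data, not sampled at random.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), dim))
word_to_idx = {w: i for i, w in enumerate(vocab)}

def embed(word: str) -> np.ndarray:
    """Map a word to its vector of real numbers."""
    return embeddings[word_to_idx[word]]

print(embed("queen"))  # a length-4 vector of real numbers
```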

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural approaches that train on an NLP task such as language modeling or document classification.
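For a concrete example, the sketch below trains skip-gram Word2Vec embeddings with gensim (assuming gensim >= 4.0; the four-sentence corpus is a made-up stand-in for real training data).

```python
from gensim.models import Word2Vec

# A toy corpus: each sentence is a list of tokens.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["a", "man", "walks"],
    ["a", "woman", "walks"],
]

# sg=1 selects the skip-gram objective; sg=0 would use CBOW.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

vector = model.wv["queen"]               # 50-dimensional embedding for "queen"
similar = model.wv.most_similar("king")  # nearest neighbours by cosine similarity
print(vector.shape, similar[:3])
```

With a real corpus, nearby words in embedding space tend to be semantically or syntactically related, which is what makes these vectors useful as features for downstream NLP tasks.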

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2881–2890 of 4,002 papers

MITRE: Seven Systems for Semantic Similarity in Tweets
Mixed Membership Word Embeddings for Computational Social Science
Mixing syntagmatic and paradigmatic information for concept detection
Mixtures of Deep Neural Experts for Automated Speech Scoring
Model-based Word Embeddings from Decompositions of Count Matrices
Model Choices Influence Attributive Word Associations: A Semi-supervised Analysis of Static Word Embeddings
Model-Free Context-Aware Word Composition
Modeling Context Words as Regions: An Ordinal Regression Approach to Word Embedding
Modeling Noisiness to Recognize Named Entities using Multitask Neural Networks on Social Media
Modeling Order in Neural Word Embeddings at Scale
