
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
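Concretely, each vocabulary word is assigned a dense real-valued vector whose geometry reflects how the word is used in context. Below is a minimal sketch of training skip-gram Word2Vec embeddings with the gensim library (gensim is an assumption here; the page does not name an implementation, and the toy corpus and hyperparameter values are purely illustrative).

```python
# Minimal sketch: skip-gram Word2Vec with gensim 4.x.
# Corpus and hyperparameters are illustrative, not from this page.
from gensim.models import Word2Vec

corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "local", "context"],
    ["glove", "factorizes", "global", "co-occurrence", "counts"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the real-valued vectors
    window=2,         # context window size
    min_count=1,      # keep every token in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=100,
)

vec = model.wv["embeddings"]          # a 50-dimensional numpy array
print(model.wv.most_similar("word"))  # nearest neighbors by cosine similarity
```

On a real corpus the same calls apply; the vector size, window, and epoch count would typically be tuned to the downstream task.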

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3721–3730 of 4002 papers

Title | Status | Hype
Factors Influencing the Surprising Instability of Word Embeddings | Code | 0
CILex: An Investigation of Context Information for Lexical Substitution Methods | Code | 0
An Empirical Evaluation of doc2vec with Practical Insights into Document Embedding Generation | Code | 0
Fair is Better than Sensational: Man is to Doctor as Woman is to Doctor | Code | 0
UdL at SemEval-2017 Task 1: Semantic Textual Similarity Estimation of English Sentence Pairs Using Regression Model over Pairwise Features | Code | 0
Local Word Vectors Guiding Keyphrase Extraction | Code | 0
A Comprehensive Comparison of Word Embeddings in Event & Entity Coreference Resolution | Code | 0
Fast and Robust Comparison of Probability Measures in Heterogeneous Spaces | Code | 0
The Lifted Matrix-Space Model for Semantic Composition | Code | 0
Churn Intent Detection in Multilingual Chatbot Conversations and Social Media | Code | 0

Leaderboards

No leaderboard results yet.