
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
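
At its simplest, an embedding is a lookup from each word to a fixed-length real-valued vector, and geometric closeness between vectors stands in for semantic similarity. The following is a minimal sketch in Python with NumPy; the three-dimensional vectors and their values are invented for illustration (real embeddings typically have hundreds of dimensions learned from data).

```python
import numpy as np

# Hypothetical toy embeddings: every word maps to a fixed-length vector
# of real numbers. These 3-d values are invented for illustration; real
# models learn vectors with hundreds of dimensions from large corpora.
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10]),
    "queen": np.array([0.78, 0.70, 0.12]),
    "apple": np.array([0.10, 0.20, 0.90]),
}

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words end up with nearby vectors.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # ~0.999
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # ~0.31
```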

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
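
As a concrete illustration of the Word2Vec approach, here is a minimal training sketch. It assumes the gensim library (4.x API) is installed; the tiny corpus and all hyperparameters are made up for illustration, not a recommended setup.

```python
from gensim.models import Word2Vec  # assumes gensim 4.x is installed

# Tiny invented corpus of pre-tokenized sentences; a useful model
# needs orders of magnitude more text.
corpus = [
    ["word", "embeddings", "map", "words", "to", "dense", "vectors"],
    ["word2vec", "learns", "embeddings", "by", "predicting", "context", "words"],
    ["glove", "fits", "embeddings", "to", "global", "cooccurrence", "counts"],
]

# sg=1 selects the skip-gram objective (predict context from the center
# word); sg=0 would use CBOW. Hyperparameters here are illustrative only.
model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the learned vectors
    window=2,         # context window size
    min_count=1,      # keep every word, since the corpus is tiny
    sg=1,
    epochs=200,
)

vector = model.wv["embeddings"]                     # a 50-d NumPy array
print(model.wv.most_similar("embeddings", topn=3))  # nearest words by cosine
```

GloVe, by contrast, is fit directly to global word co-occurrence statistics rather than trained window by window, but it likewise produces one real-valued vector per vocabulary word.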

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1801–1810 of 4002 papers

Title | Hype
Implicit Phenomena in Short-answer Scoring Data | 0
Implicit Subjective and Sentimental Usages in Multi-sense Word Embeddings | 0
Improving Zero Shot Learning Baselines with Commonsense Knowledge | 0
Importance of Self-Attention for Sentiment Analysis | 0
Indigenous Language Revitalization and the Dilemma of Gender Bias | 0
Improved and Robust Controversy Detection in General Web Pages Using Semantic Approaches under Large Scale Conditions | 0
Des pseudo-sens pour améliorer l'extraction de synonymes à partir de plongements lexicaux (Pseudo-senses for improving the extraction of synonyms from word embeddings) | 0
Cooperative Semi-Supervised Transfer Learning of Machine Reading Comprehension | 0
Improved CCG Parsing with Semi-supervised Supertagging | 0
Designing a Russian Idiom-Annotated Corpus | 0
