
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural approaches that train on an NLP task such as language modeling or document classification.

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)
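To make the mapping concrete, here is a minimal sketch of training and querying word embeddings with Word2Vec, assuming the gensim library (4.x) is installed; the toy corpus and hyperparameter values are illustrative and do not come from any of the papers listed below.

```python
# Minimal Word2Vec sketch (assumes gensim >= 4.0 is installed).
# The toy corpus and hyperparameters are illustrative only.
from gensim.models import Word2Vec

# A tiny tokenized corpus; real training uses millions of sentences.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context", "windows"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
]

# Train a skip-gram model: each word is trained to predict the words
# appearing in its surrounding context window.
model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the real-valued vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

# Each vocabulary word now maps to a 50-dimensional real vector.
vec = model.wv["embeddings"]
print(vec.shape)  # (50,)

# Nearness in the vector space reflects distributional similarity.
print(model.wv.most_similar("embeddings", topn=3))
```

On a realistic corpus, the nearest neighbors returned by most_similar group semantically related words together, which is the property the papers below build on.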

Papers

Showing 2561–2570 of 4002 papers

Title | Status | Hype
RuSentiment: An Enriched Sentiment Analysis Dataset for Social Media in Russian | – | 0
Joint Learning from Labeled and Unlabeled Data for Information Retrieval | – | 0
From Text to Lexicon: Bridging the Gap between Word Embeddings and Lexical Resources | Code | 0
Relation Induction in Word Embeddings Revisited | – | 0
A New Approach to Animacy Detection | – | 0
Transition-based Neural RST Parsing with Implicit Syntax Features | Code | 0
Recognizing Humour using Word Associations and Humour Anchor Extraction | – | 0
Learning Word Meta-Embeddings by Autoencoding | Code | 0
Summarization Evaluation in the Absence of Human Model Summaries Using the Compositionality of Word Embeddings | – | 0
Enriching Word Embeddings with Domain Knowledge for Readability Assessment | – | 0

No leaderboard results yet.