
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
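To make the mapping concrete, here is a minimal sketch of an embedding lookup table in Python. The 3-dimensional vectors are made up purely for illustration; real embeddings are learned from data and typically have tens to hundreds of dimensions.

import numpy as np

# Toy embedding table: each word maps to a vector of real numbers.
# These values are illustrative only, not learned from any corpus.
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1]),
    "queen": np.array([0.7, 0.4, 0.1]),
    "apple": np.array([0.1, 0.2, 0.9]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; higher means more similar."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words end up closer in the vector space.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low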

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
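As a sketch of how such embeddings are trained in practice, the snippet below uses the gensim library's Word2Vec implementation (gensim is an assumption here; the page itself names only the Word2Vec technique). The toy corpus and hyperparameter values are illustrative.

from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences (illustrative only).
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context", "windows"],
    ["glove", "factorizes", "a", "global", "co-occurrence", "matrix"],
]

# Train a skip-gram Word2Vec model (sg=1); vector_size sets the
# embedding dimensionality, window the context size.
model = Word2Vec(sentences=corpus, vector_size=50, window=2, min_count=1, sg=1)

vec = model.wv["embeddings"]                            # learned vector for a word
neighbors = model.wv.most_similar("embeddings", topn=3) # nearest words in the space

On a corpus this small the neighbors are meaningless; with a realistic corpus, most_similar returns semantically related words.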

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2771-2780 of 4002 papers

Title | Status | Hype
Simplifying Sentences with Sequence to Sequence Models | - | 0
Unsupervised Learning of Style-sensitive Word Vectors | - | 0
Unsupervised Abstractive Meeting Summarization with Multi-Sentence Compression and Budgeted Submodular Maximization | Code | 0
Effects of Word Embeddings on Neural Network-based Pitch Accent Detection | - | 0
New Embedded Representations and Evaluation Protocols for Inferring Transitive Relations | - | 0
Analogical Reasoning on Chinese Morphological and Semantic Relations | Code | 0
Domain Adapted Word Embeddings for Improved Sentiment Classification | Code | 0
NRC-Canada at SMM4H Shared Task: Classifying Tweets Mentioning Adverse Drug Reactions and Medication Intake | - | 0
Learning Domain-Sensitive and Sentiment-Aware Word Embeddings | - | 0
Incorporating Subword Information into Matrix Factorization Word Embeddings | Code | 0
Page 278 of 401

No leaderboard results yet.