
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
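Because embeddings place semantically related words near each other in vector space, similarity between words is typically measured with cosine similarity. A minimal sketch of this idea, using hypothetical hand-picked 4-dimensional vectors rather than vectors learned by Word2Vec or GloVe (real models learn 50- to 300-dimensional vectors from large corpora):

```python
import math

# Toy word embeddings: hypothetical vectors, for illustration only.
# "king" and "queen" are deliberately placed close together; "apple" far away.
embeddings = {
    "king":  [0.80, 0.65, 0.10, 0.05],
    "queen": [0.78, 0.68, 0.12, 0.04],
    "apple": [0.10, 0.05, 0.90, 0.70],
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

sim_related = cosine_similarity(embeddings["king"], embeddings["queen"])
sim_unrelated = cosine_similarity(embeddings["king"], embeddings["apple"])

print(f"king/queen similarity: {sim_related:.3f}")    # semantically close -> high
print(f"king/apple similarity: {sim_unrelated:.3f}")  # unrelated -> lower
```

With learned embeddings the same computation recovers relationships such as synonymy and analogy; here the geometry is fixed by hand purely to show the mechanics.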

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2201–2210 of 4002 papers

Titles (all listed papers have a Hype score of 0):

- Sentence Embedding Evaluation Using Pyramid Annotation
- Sentence-level Online Handwritten Chinese Character Recognition
- Sentence Modeling via Multiple Word Embeddings and Multi-level Comparison for Semantic Textual Similarity
- Sentence Modeling with Deep Neural Architecture using Lexicon and Character Attention Mechanism for Sentiment Classification
- Sentence Segmentation in Narrative Transcripts from Neuropsychological Tests using Recurrent Convolutional Neural Networks
- Sentence Selection Strategies for Distilling Word Embeddings from BERT
- Sentence Similarity Measures for Fine-Grained Estimation of Topical Relevance in Learner Essays
- Senti17 at SemEval-2017 Task 4: Ten Convolutional Neural Network Voters for Tweet Polarity Classification
- Sentiment Analysis by Joint Learning of Word Embeddings and Classifier
Page 221 of 401

No leaderboard results yet.