
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
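As a minimal illustration of the idea, the sketch below trains a small skip-gram Word2Vec model with the gensim library; the library choice, the toy corpus, and all parameter values are assumptions for illustration rather than anything this page prescribes.

```python
# Minimal sketch: learning word embeddings with Word2Vec (gensim assumed).
from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences (illustrative only).
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "local", "context"],
    ["glove", "fits", "global", "co-occurrence", "statistics"],
]

# Train a skip-gram model (sg=1); vector_size is the embedding dimension.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

# Each vocabulary word is now mapped to a vector of real numbers,
# and semantically similar words tend to land near each other.
vec = model.wv["embeddings"]          # a 50-dimensional numpy array
print(model.wv.most_similar("word"))  # nearest neighbors by cosine similarity
```

On a real corpus the same call scales up directly; only the sentence iterator, the embedding dimension, and the training epochs typically change.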

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2231-2240 of 4002 papers

Title | Status | Hype
Siamese Network-Based Supervised Topic Modeling | - | 0
Siamese Networks for Inference in Malayalam Language Texts | - | 0
SIEVE: Helping Developers Sift Wheat from Chaff via Cross-Platform Analysis | - | 0
Signatures of prediction during natural listening in MEG data? | - | 0
SimBow at SemEval-2017 Task 3: Soft-Cosine Semantic Similarity between Questions for Community Question Answering | - | 0
SimCompass: Using Deep Learning Word Embeddings to Assess Cross-level Similarity | - | 0
SimiHawk at SemEval-2016 Task 1: A Deep Ensemble System for Semantic Textual Similarity | - | 0
Similarity-Based Reconstruction Loss for Meaning Representation | - | 0
SimPA: A Sentence-Level Simplification Corpus for the Public Administration Domain | - | 0
Simple Algorithms For Sentiment Analysis On Sentiment Rich, Data Poor Domains. | - | 0

Leaderboard

No leaderboard results yet.