
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
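To make the definition concrete, here is a minimal sketch of the lookup-table view of an embedding: a matrix with one real-valued row per vocabulary word, so looking a word up yields its vector. The vocabulary, dimension, and random initialization below are illustrative assumptions; a trained model would supply the matrix.

```python
# Minimal sketch of a word-embedding lookup table.
# Vocabulary, dimension, and random values are illustrative assumptions.
import numpy as np

vocab = {"king": 0, "queen": 1, "man": 2, "woman": 3}
embedding_dim = 8
rng = np.random.default_rng(0)

# Embedding matrix: one row of `embedding_dim` real numbers per word.
E = rng.normal(size=(len(vocab), embedding_dim))

def embed(word: str) -> np.ndarray:
    """Map a word to its vector of real numbers via table lookup."""
    return E[vocab[word]]

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity, the usual way embedding vectors are compared."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(embed("king"))                           # an 8-dimensional vector
print(cosine(embed("king"), embed("queen")))   # similarity of two vectors
```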

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
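As a sketch of how one such technique is used in practice, the snippet below trains a small skip-gram Word2Vec model with the gensim library (assuming gensim 4.x is installed). The toy corpus and hyperparameters are illustrative assumptions, not canonical settings.

```python
# Hedged sketch: skip-gram Word2Vec on a toy corpus with gensim 4.x.
# Corpus and hyperparameters are illustrative; real models train on
# large text collections.
from gensim.models import Word2Vec

corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "man", "walks", "in", "the", "city"],
    ["the", "woman", "walks", "in", "the", "city"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the word vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=200,       # many epochs to compensate for the tiny corpus
    seed=0,
)

print(model.wv["king"])                       # the learned 50-d vector
print(model.wv.most_similar("king", topn=3))  # nearest neighbors
```

GloVe, by contrast, fits word vectors to corpus-wide co-occurrence statistics rather than sliding-window predictions, but the resulting vectors are consumed the same way.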

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1501–1510 of 4002 papers

Title | Hype
Aspect-based Opinion Summarization with Convolutional Neural Networks | 0
A Multi-Domain Framework for Textual Similarity. A Case Study on Question-to-Question and Question-Answering Similarity Tasks | 0
Comparative study of LSA vs Word2vec embeddings in small corpora: a case study in dreams database | 0
Comparative Analysis of Static and Contextual Embeddings for Analyzing Semantic Changes in Medieval Latin Charters | 0
ASOBEK at SemEval-2016 Task 1: Sentence Representation with Character N-gram Embeddings for Semantic Textual Similarity | 0
A Multidimensional Lexicon for Interpersonal Stancetaking | 0
A Deep Content-Based Model for Persian Rumor Verification | 0
Community Evaluation and Exchange of Word Vectors at wordvectors.org | 0
Ask the GRU: Multi-Task Learning for Deep Text Recommendations | 0
Coming to Your Senses: on Controls and Evaluation Sets in Polysemy Research | 0

Leaderboard

No leaderboard results yet.