
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
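
To make the definition concrete, here is a toy sketch of such a mapping (the three-dimensional vectors below are invented for illustration, not learned from data): each vocabulary word maps to a vector of real numbers, and relatedness between words can be measured with cosine similarity.

```python
import numpy as np

# Hypothetical 3-dimensional embeddings; real embeddings are learned
# from text and typically have 50-300 dimensions.
embeddings = {
    "king":  np.array([0.8, 0.1, 0.3]),
    "queen": np.array([0.7, 0.2, 0.3]),
    "apple": np.array([0.1, 0.9, 0.5]),
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should have similar vectors.
print(cosine(embeddings["king"], embeddings["queen"]))  # high
print(cosine(embeddings["king"], embeddings["apple"]))  # lower
```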

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches that train on an NLP task such as language modeling or document classification. A minimal training sketch follows.
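
The sketch below trains a skip-gram Word2Vec model with the gensim library (an assumed dependency; the toy corpus and all hyperparameter values are illustrative only, not a reference configuration).

```python
from gensim.models import Word2Vec

# Each sentence is a list of tokens; a real corpus would be far larger.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["dogs", "and", "cats", "are", "animals"],
]

# vector_size: embedding dimension; window: context size;
# min_count=1 keeps every token (toy corpus only); sg=1 selects skip-gram.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)

vec = model.wv["king"]                # the learned 50-dimensional vector
print(model.wv.most_similar("king"))  # nearest neighbours by cosine similarity
```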

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3271–3280 of 4002 papers

Title | Status | Hype
Scientific document summarization via citation contextualization and scientific discourse | — | 0
Context encoders as a simple but powerful extension of word2vec | Code | 0
Insights into Analogy Completion from the Biomedical Domain | Code | 0
Learning Structured Semantic Embeddings for Visual Recognition | — | 0
Order embeddings and character-level convolutions for multimodal alignment | — | 0
Wordsurf: un outil pour naviguer dans un espace de « Word Embeddings » (Wordsurf: a tool to surf in a "word embeddings" space) | — | 0
The Mixing method: low-rank coordinate descent for semidefinite programming with diagonal constraints | Code | 0
Deep Learning for Hate Speech Detection in Tweets | Code | 0
Learning to Compute Word Embeddings On the Fly | — | 0
Does the Geometry of Word Embeddings Help Document Classification? A Case Study on Persistent Homology Based Representations | — | 0
Page 328 of 401

No leaderboard results yet.