
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
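To make the definition concrete, here is a minimal sketch of an embedding as a lookup table from words to real-valued vectors. The 4-dimensional vectors and the vocabulary below are toy values invented purely for illustration, not trained embeddings:

```python
import numpy as np

# A word embedding is a lookup table: each vocabulary word maps to a
# dense vector of real numbers. These 4-d vectors are made-up toy
# values for illustration, not trained embeddings.
embeddings = {
    "king":  np.array([0.50, 0.68, -0.12, 0.30]),
    "queen": np.array([0.52, 0.71, -0.10, 0.28]),
    "apple": np.array([-0.40, 0.05, 0.61, -0.22]),
}

def cosine_similarity(u, v):
    # Cosine similarity is the usual way to compare embedding vectors.
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should end up closer than unrelated ones.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```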

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
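As a sketch of how such embeddings are learned in practice, the snippet below trains a skip-gram Word2Vec model with the gensim library (assumed installed via pip install gensim) on a tiny made-up corpus; real training would use a much larger text collection:

```python
from gensim.models import Word2Vec

# Tiny illustrative corpus: each sentence is a list of tokens.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "farmer", "grows", "an", "apple"],
]

# sg=1 selects the skip-gram objective; vector_size sets the embedding
# dimensionality; window is the context size around each target word.
model = Word2Vec(sentences=corpus, vector_size=50, window=2,
                 min_count=1, sg=1, epochs=50, seed=0)

vector = model.wv["king"]                        # learned 50-d vector
neighbors = model.wv.most_similar("king", topn=2)  # nearest words
print(vector.shape, neighbors)
```

With enough data, the nearest neighbors of a word in this vector space tend to be semantically related words, which is what makes the learned vectors useful as features for downstream NLP tasks.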

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1991–2000 of 4002 papers

Evaluation of Morphological Embeddings for the Russian Language
Joint Learning of Hierarchical Word Embeddings from a Corpus and a Taxonomy
Evaluation of Greek Word Embeddings
Joint Learning of Word and Label Embeddings for Sequence Labelling in Spoken Language Understanding
Jointly Learning to Embed and Predict with Multiple Languages
Jointly Learning Topic Specific Word and Document Embedding
Classification of Micro-Texts Using Sub-Word Embeddings
Jointly modelling the evolution of social structure and language in online communities
A Review of Standard Text Classification Practices for Multi-label Toxicity Identification of Online Content
A Locally Linear Procedure for Word Translation
