
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)
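As a rough illustration of how such embeddings are learned, the sketch below trains a small skip-gram Word2Vec model with the gensim library on a toy corpus; the corpus, hyperparameters, and query word are illustrative assumptions, not drawn from any paper listed here.

```python
# A minimal sketch of learning word embeddings with Word2Vec,
# assuming gensim >= 4.0 (pip install gensim) and a toy corpus.
from gensim.models import Word2Vec

# Tiny illustrative corpus: each document is a list of tokens.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
]

# Train a skip-gram model (sg=1); vector_size is the embedding dimension.
model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the real-valued vectors
    window=2,         # context window size
    min_count=1,      # keep every token in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=100,
)

# Each vocabulary word is now mapped to a 50-dimensional real vector.
vec = model.wv["embeddings"]
print(vec.shape)                                   # (50,)
print(model.wv.most_similar("embeddings", topn=3))
```

On a realistic corpus, the nearest neighbors returned by most_similar reflect distributional similarity; with a corpus this small the output is only meant to show the API shape.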

Papers

Showing 1981–1990 of 4002 papers

Japanese Lexical Simplification for Non-Native Speakers
Japanese Word Readability Assessment using Word Embeddings
Evaluation of Morphological Embeddings for the Russian Language
JeSemE: Interleaving Semantics and Emotions in a Web Service for the Exploration of Language Change Phenomena
Evaluation of Greek Word Embeddings
JHU System Description for the MADAR Arabic Dialect Identification Shared Task
Classification of Micro-Texts Using Sub-Word Embeddings
Joint Embeddings of Chinese Words, Characters, and Fine-grained Subcharacter Components
A Review of Standard Text Classification Practices for Multi-label Toxicity Identification of Online Content
A Locally Linear Procedure for Word Translation
