
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)
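As a minimal sketch of the mapping described above, the snippet below trains a small skip-gram Word2Vec model with the gensim library (an assumed dependency; this page prescribes no toolkit) and reads back the learned real-valued vectors. The toy corpus and every hyperparameter value are illustrative placeholders.

    # Minimal Word2Vec sketch; assumes gensim is installed (pip install gensim).
    # The toy corpus and hyperparameters below are illustrative only.
    from gensim.models import Word2Vec

    # A tiny tokenized corpus; real models train on millions of sentences.
    sentences = [
        ["the", "king", "rules", "the", "kingdom"],
        ["the", "queen", "rules", "the", "kingdom"],
        ["embeddings", "map", "words", "to", "vectors"],
        ["similar", "words", "receive", "similar", "vectors"],
    ]

    model = Word2Vec(
        sentences,
        vector_size=50,  # dimensionality of the real-valued vectors
        window=2,        # context words considered on each side of the target
        min_count=1,     # keep every word in this toy corpus
        sg=1,            # 1 = skip-gram objective; 0 = CBOW
        epochs=50,
    )

    vec = model.wv["king"]                # a 50-dimensional numpy array
    print(vec.shape)                      # (50,)
    print(model.wv.most_similar("king"))  # nearest neighbours by cosine similarity

GloVe, by contrast, is fit to global word co-occurrence counts rather than trained window-by-window, but it yields the same kind of dense real-valued vectors.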

Papers

Showing 2191–2200 of 4002 papers

Title | Hype
Evaluating Word Embeddings Using a Representative Suite of Practical Tasks | 0
Evaluating word embeddings with fMRI and eye-tracking | 0
Evaluation Framework for Understanding Sensitive Attribute Association Bias in Latent Factor Recommendation Algorithms | 0
Evaluation methods for unsupervised word embeddings | 0
Evaluation of acoustic word embeddings | 0
Evaluation of Deep Learning Models for Hostility Detection in Hindi Text | 0
Evaluation of Dictionary Creating Methods for Finno-Ugric Minority Languages | 0
Evaluation of Domain-specific Word Embeddings using Knowledge Resources | 0
Evaluation of Greek Word Embeddings | 0
Evaluation of Morphological Embeddings for the Russian Language | 0
Page 220 of 401

No leaderboard results yet.