
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
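As a minimal sketch of what "mapped to vectors of real numbers" means, the snippet below builds a toy lookup table with hand-picked 3-dimensional vectors (all values are invented for illustration; real embeddings are learned from data and use hundreds of dimensions) and compares words by cosine similarity:

```python
import numpy as np

# Toy lookup table: each word in the vocabulary maps to a dense
# real-valued vector (values are made up purely for illustration).
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1]),
    "queen": np.array([0.7, 0.4, 0.2]),
    "apple": np.array([0.1, 0.9, 0.6]),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity of two word vectors: 1.0 means same direction."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Semantically related words should end up closer in the vector space.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high (~0.98)
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # lower (~0.44)
```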

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification. A training sketch follows below.
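As one concrete example, here is a minimal sketch of training Word2Vec with the gensim library, assuming gensim 4.x is installed (`pip install gensim`). The three-sentence corpus is a toy stand-in; meaningful embeddings require far more text:

```python
from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences (stand-in for real data).
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "raw", "text"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the learned vectors
    window=3,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

vec = model.wv["embeddings"]                      # the learned 50-d vector
print(model.wv.most_similar("embeddings", topn=3))  # nearest neighbors
```

On a real corpus the nearest neighbors of a word come out semantically related; on this toy input they are essentially arbitrary.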

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1421–1430 of 4002 papers

| Title | Status | Hype |
| --- | --- | --- |
| Evaluating Word Embedding Models: Methods and Experimental Results | | 0 |
| Evaluating Word Embeddings for Indonesian–English Code-Mixed Text Based on Synthetic Data | | 0 |
| Detecting Most Frequent Sense using Word Embeddings and BabelNet | | 0 |
| Detecting Metaphorical Phrases in the Polish Language | | 0 |
| Evaluating Word Embeddings for Sentence Boundary Detection in Speech Transcripts | | 0 |
| Evaluating Word Embeddings in Extremely Under-Resourced Languages: A Case Study in Bribri | | 0 |
| CLaC Lab at SemEval-2019 Task 3: Contextual Emotion Detection Using a Combination of Neural Networks and SVM | | 0 |
| Evaluating Word Embeddings on Low-Resource Languages | | 0 |
| Evaluating Word Embeddings Using a Representative Suite of Practical Tasks | | 0 |
| Better Automatic Evaluation of Open-Domain Dialogue Systems with Contextualized Embeddings | | 0 |

No leaderboard results yet.