Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) in which words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that are trained on an NLP task such as language modeling or document classification.
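As a concrete illustration, here is a minimal sketch of training Word2Vec embeddings with the gensim library (assuming gensim 4.x); the toy corpus and parameter choices are illustrative only, not taken from this page:

```python
# Minimal Word2Vec sketch, assuming gensim 4.x (pip install gensim).
from gensim.models import Word2Vec

# Toy corpus: each sentence is a list of tokens.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["similar", "words", "get", "similar", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
]

# Train a small skip-gram model (sg=1); vector_size is the
# dimensionality of the learned embedding space.
model = Word2Vec(
    sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=50
)

# Each vocabulary word is now a dense real-valued vector.
vector = model.wv["words"]  # numpy array of shape (50,)
print(vector[:5])

# Nearest neighbours by cosine similarity in the embedding space.
print(model.wv.most_similar("words", topn=3))
```

In practice the corpus would be far larger, and hyperparameters such as `vector_size`, `window`, and the skip-gram vs. CBOW choice (`sg`) are tuned to the downstream task.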

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing papers 3731–3740 of 4002 (page 374 of 401)

Title | Hype
Effect of dimensionality change on the bias of word embeddings | 0
Effect of Text Color on Word Embeddings | 0
Effect of Text Processing Steps on Twitter Sentiment Classification using Word Embedding | 0
Effects of Creativity and Cluster Tightness on Short Text Clustering Performance | 0
Effects of Word Embeddings on Neural Network-based Pitch Accent Detection | 0
Efficient Contextual Representation Learning Without Softmax Layer | 0
Efficient Data Selection for Bilingual Terminology Extraction from Comparable Corpora | 0
Efficient Extraction of Pseudo-Parallel Sentences from Raw Monolingual Data Using Word Embeddings | 0
Efficient Non-parametric Estimation of Multiple Embeddings per Word in Vector Space | 0
EICA Team at SemEval-2018 Task 2: Semantic and Metadata-based Features for Multilingual Emoji Prediction | 0

No leaderboard results yet.