
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
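A minimal sketch of how such an embedding can be learned in practice, using gensim's Word2Vec on a toy tokenized corpus (assuming gensim 4.x, where the embedding dimension is set with vector_size; the corpus and parameter values here are purely illustrative):

```python
from gensim.models import Word2Vec

# Toy corpus: each document is a list of already-tokenized words.
corpus = [
    ["word", "embeddings", "map", "words", "to", "dense", "vectors"],
    ["word2vec", "learns", "vectors", "from", "local", "word", "cooccurrence"],
    ["glove", "learns", "vectors", "from", "global", "cooccurrence", "counts"],
]

# sg=1 selects the skip-gram objective; vector_size is the embedding dimension.
model = Word2Vec(corpus, vector_size=50, window=3, min_count=1, sg=1, epochs=50)

# Each vocabulary word is now mapped to a vector of real numbers ...
vec = model.wv["word"]            # numpy array of shape (50,)
print(vec.shape)

# ... and nearby vectors correspond to distributionally similar words.
print(model.wv.most_similar("vectors", topn=3))
```

On a realistic corpus, the resulting vectors place words that occur in similar contexts close together, which is what downstream tasks exploit.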

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2661–2670 of 4002 papers (page 267 of 401)

Title | Hype
Variable-Bitrate Neural Compression via Bayesian Arithmetic Coding | 0
Variable Mini-Batch Sizing and Pre-Trained Embeddings | 0
Variance of Twitter Embeddings and Temporal Trends of COVID-19 cases | 0
Variational Gaussian Topic Model with Invertible Neural Projections | 0
Varying Linguistic Purposes of Emoji in (Twitter) Context | 0
Vec2Node: Self-training with Tensor Augmentation for Text Classification with Few Labels | 0
VecShare: A Framework for Sharing Word Representation Vectors | 0
Vectorial Semantic Spaces Do Not Encode Human Judgments of Intervention Similarity | 0
Vector representations of text data in deep learning | 0
VectorWeavers at SemEval-2016 Task 10: From Incremental Meaning to Semantic Unit (phrase by phrase) | 0

No leaderboard results yet.