
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
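As a concrete illustration of this mapping, the sketch below pairs a few words with made-up 3-dimensional vectors and compares them with cosine similarity. The vocabulary, the vector values, and the cosine_similarity helper are illustrative only, not taken from any trained model:

```python
import numpy as np

# A word embedding is a mapping from each vocabulary item to a
# real-valued vector. These 3-dimensional vectors are made up for
# illustration; trained embeddings typically have 50-300 dimensions.
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1]),
    "queen": np.array([0.7, 0.4, 0.1]),
    "apple": np.array([0.1, 0.2, 0.9]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: near 1 for similar words."""
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high (~0.99)
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low  (~0.29)
```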

Techniques for learning word embeddings include Word2Vec, GloVe, and other (typically neural network-based) approaches that train on an NLP task such as language modeling or document classification.

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)
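On the training side, here is a minimal sketch of learning embeddings with the gensim library's Word2Vec implementation, assuming gensim is installed; the toy corpus and hyperparameter values are illustrative only:

```python
from gensim.models import Word2Vec

# Toy corpus: each document is a list of tokens. Real training uses
# millions of sentences; this tiny corpus is for illustration only.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "word", "cooccurrence"],
    ["glove", "learns", "embeddings", "from", "global", "cooccurrence", "counts"],
]

# vector_size: dimensionality of the learned vectors; window: context size;
# min_count=1 keeps every token despite the tiny corpus.
model = Word2Vec(corpus, vector_size=50, window=3, min_count=1, epochs=50, seed=0)

vec = model.wv["embeddings"]          # the learned 50-dimensional vector
print(model.wv.most_similar("word"))  # nearest neighbours in embedding space
```

In practice vector_size is usually 100-300, and with a large corpus min_count is raised to prune rare tokens before training.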

Papers

Showing 2041–2050 of 4002 papers

Random Positive-Only Projections: PPMI-Enabled Incremental Semantic Space Construction
Random Walks and Neural Network Language Models on Knowledge Bases
Ranking Convolutional Recurrent Neural Networks for Purchase Stage Identification on Imbalanced Twitter Data
Ranking Kernels for Structures and Embeddings: A Hybrid Preference and Classification Model
RankMat: Matrix Factorization with Calibrated Distributed Embedding and Fairness Enhancement
Rare Tokens Degenerate All Tokens: Improving Neural Text Generation via Adaptive Gradient Gating for Rare Token Embeddings
RAW-C: Relatedness of Ambiguous Words in Context (A New Lexical Resource for English)
Reading Between the Lines: Overcoming Data Sparsity for Accurate Classification of Lexical Relationships
Realised Volatility Forecasting: Machine Learning via Financial Word Embedding
Real Multi-Sense or Pseudo Multi-Sense: An Approach to Improve Word Representation

Leaderboards

No leaderboard results yet.