
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
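As a concrete illustration of this mapping, here is a minimal sketch in Python with NumPy. The vocabulary, dimension, and vector values are hypothetical placeholders; a trained model would supply learned values instead of random ones.

```python
import numpy as np

# Hypothetical vocabulary mapped to row indices of an embedding matrix.
vocab = {"king": 0, "queen": 1, "apple": 2}
dim = 4  # embedding dimension (real systems typically use 50-1000)

# Each word is a row of real numbers; random placeholders stand in for
# the values a trained model would learn.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), dim))

def embed(word: str) -> np.ndarray:
    """Map a word to its real-valued vector via table lookup."""
    return embeddings[vocab[word]]

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity, the usual way to compare word vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(embed("king"))
print(cosine(embed("king"), embed("queen")))
```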

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches, many of which train neural networks on an NLP task such as language modeling or document classification.
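As one example of such a technique, the following is a minimal sketch of training a skip-gram Word2Vec model with the gensim library (assuming gensim >= 4.0; the corpus and hyperparameters are purely illustrative):

```python
from gensim.models import Word2Vec

# Tiny illustrative corpus: one tokenized sentence per list entry.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "statistics"],
]

model = Word2Vec(
    sentences,
    vector_size=50,  # dimensionality of the word vectors
    window=3,        # context window size
    min_count=1,     # keep every token in this toy corpus
    sg=1,            # 1 = skip-gram, 0 = CBOW
    epochs=100,      # many epochs to compensate for the tiny corpus
)

vector = model.wv["embeddings"]          # the learned real-valued vector
similar = model.wv.most_similar("word")  # nearest neighbours by cosine
print(vector.shape, similar[:3])
```

On a realistic corpus one would raise `min_count` and lower `epochs`; the skip-gram objective (`sg=1`) predicts context words from a target word, which is what pushes words with similar contexts toward nearby vectors.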

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing papers 3451–3460 of 4002

Title | Status | Hype
A Systematic Comparison of English Noun Compound Representations | Code | 0
Efficient Exact Gradient Update for training Deep Networks with Very Large Sparse Targets | Code | 0
Learning and Evaluating Character Representations in Novels | Code | 0
Object Priors for Classifying and Localizing Unseen Actions | Code | 0
Clustering Word Embeddings with Self-Organizing Maps. Application on LaRoSeDa -- A Large Romanian Sentiment Data Set | Code | 0
Efficient Structured Inference for Transition-Based Parsing with Neural Networks and Error States | Code | 0
Efficient Vector Representation for Documents through Corruption | Code | 0
EF-Net: A Deep Learning Approach Combining Word Embeddings and Feature Fusion for Patient Disposition Analysis | Code | 0
A Systematic Comparison of Contextualized Word Embeddings for Lexical Semantic Change | Code | 0
EigenSent: Spectral sentence embeddings using higher-order Dynamic Mode Decomposition | Code | 0
