
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that are trained on an NLP task such as language modeling or document classification.

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)
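As an illustration of the idea, below is a minimal sketch of training word embeddings with the gensim library's Word2Vec implementation (assuming gensim 4.x); the toy corpus and hyperparameters are purely illustrative and not taken from any paper listed here.

```python
# Minimal sketch: learning word embeddings with gensim's Word2Vec (gensim 4.x).
from gensim.models import Word2Vec

# Toy corpus: each document is a pre-tokenized list of words.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "raw", "text"],
    ["glove", "factorizes", "a", "global", "co-occurrence", "matrix"],
]

# Train skip-gram embeddings; vector_size is the embedding dimension.
model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the real-valued vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=50,        # more passes help on a corpus this small
)

# Each vocabulary word is now mapped to a 50-dimensional real-valued vector.
vec = model.wv["embeddings"]
print(vec.shape)                      # (50,)
print(model.wv.most_similar("word"))  # nearest neighbors in embedding space
```

Skip-gram (sg=1) predicts context words from a target word and tends to work better for rare words, while CBOW (sg=0) predicts a word from its context and trains faster; either yields the word-to-vector mapping described above.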

Papers

Showing 3581–3590 of 4002 papers

Leveraging Deep Graph-Based Text Representation for Sentiment Polarity Applications
DeepTC -- An Extension of DKPro Text Classification for Fostering Reproducibility of Deep Learning Experiments
DeepXML: Scalable & Accurate Deep Extreme Classification for Matching User Queries to Advertiser Bid Phrases
"Definition Modeling: To model definitions." Generating Definitions With Little to No Semantics
Delexicalized Word Embeddings for Cross-lingual Dependency Parsing
Delta Embedding Learning
De-Mixing Sentiment from Code-Mixed Text
Demographic Word Embeddings for Racism Detection on Twitter
Demonstration of a Literature Based Discovery System based on Ontologies, Semantic Filters and Word Embeddings for the Raynaud Disease-Fish Oil Rediscovery
Denoising Word Embeddings by Averaging in a Shared Space
