
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
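As a minimal sketch of this mapping, the Python snippet below stores a tiny hand-made embedding table and compares words by cosine similarity. The words, the 4-dimensional vectors, and their values are hypothetical, chosen only to illustrate the idea; real embeddings are learned from data and typically have hundreds of dimensions.

```python
import numpy as np

# Toy embedding table: each vocabulary word maps to a dense real-valued vector.
# These 4-dimensional vectors are hand-picked for illustration only.
embeddings = {
    "king":  np.array([0.8, 0.65, 0.1, 0.2]),
    "queen": np.array([0.8, 0.60, 0.9, 0.2]),
    "apple": np.array([0.1, 0.05, 0.4, 0.9]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; values near 1.0 mean similar."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # relatively high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # relatively low
```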

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches that train on an auxiliary NLP task such as language modeling or document classification.
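As a hedged example of one such technique, the sketch below trains a small skip-gram Word2Vec model with the gensim library (parameter names assume gensim >= 4.0). The toy corpus is invented for illustration; a real model would be trained on a large tokenized text collection.

```python
from gensim.models import Word2Vec

# Hypothetical toy corpus: a real setup would use a large tokenized corpus.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["we", "ate", "an", "apple", "and", "an", "orange"],
]

# Train a skip-gram Word2Vec model (sg=1); parameter names follow gensim >= 4.0.
model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the learned embeddings
    window=3,         # context window size
    min_count=1,      # keep every word, since the corpus is tiny
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=100,
)

vector = model.wv["king"]                     # the learned 50-dimensional vector
print(model.wv.most_similar("king", topn=3))  # nearest neighbors in embedding space
```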

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2781–2790 of 4002 papers

Title | Hype
Lexicon-Enhancement of Embedding-based Approaches Towards the Detection of Abusive Language | 0
Lexicon Infused Phrase Embeddings for Named Entity Resolution | 0
Lexicon Integrated CNN Models with Attention for Sentiment Analysis | 0
LIA at SemEval-2017 Task 4: An Ensemble of Neural Networks for Sentiment Classification | 0
Lifelong Word Embedding via Meta-Learning | 0
LightRel at SemEval-2018 Task 7: Lightweight and Fast Relation Classification | 0
Lightweight Efficient Multi-keyword Ranked Search over Encrypted Cloud Data using Dual Word Embeddings | 0
Limbic: Author-Based Sentiment Aspect Modeling Regularized with Word Embeddings and Discourse Relations | 0
LIMSI-COT at SemEval-2016 Task 12: Temporal relation identification using a pipeline of classifiers | 0
Page 279 of 401

No leaderboard results yet.