
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
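To make the mapping concrete, here is a toy sketch in NumPy (illustrative only: the vocabulary is tiny and the vectors are random placeholders, whereas real embeddings are learned from data):

import numpy as np

# Toy vocabulary mapped to random 4-dimensional vectors.
# Real embeddings are learned from text; random values stand in
# here purely to illustrate the word -> vector mapping.
vocab = ["king", "queen", "man", "woman"]
rng = np.random.default_rng(0)
embeddings = {word: rng.standard_normal(4) for word in vocab}

# Each word is now a point in R^4, so word similarity can be
# measured geometrically, e.g. with cosine similarity.
def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(embeddings["king"], embeddings["queen"]))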

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
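As a minimal sketch of how such a model might be trained (assuming the gensim library, version 4 or later; the corpus and hyperparameters below are illustrative, not taken from any of the listed papers):

from gensim.models import Word2Vec

# A tiny tokenized corpus; real training uses large text collections.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["a", "man", "walks", "in", "the", "city"],
    ["a", "woman", "walks", "in", "the", "city"],
]

# Train a skip-gram Word2Vec model (sg=1); vector_size, window,
# min_count, and epochs are chosen only to make this toy example run.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=100)

# Look up the learned vector for a word and query its nearest neighbours.
vec = model.wv["king"]
print(model.wv.most_similar("king", topn=3))

GloVe, by contrast, fits vectors to corpus-wide co-occurrence statistics rather than sliding a local context window over the text.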

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1711–1720 of 4002 papers

Title | Status | Hype
Hate speech detection using static BERT embeddings | - | 0
HCCL at SemEval-2017 Task 2: Combining Multilingual Word Embeddings and Transliteration Model for Semantic Similarity | - | 0
Analogy-based detection of morphological and semantic relations with word embeddings: what works and what doesn't. | - | 0
HECTOR: A Hybrid TExt SimplifiCation TOol for Raw Texts in French | - | 0
Hybrid Code Networks using a convolutional neural network as an input layer achieves higher turn accuracy | - | 0
HG2Vec: Improved Word Embeddings from Dictionary and Thesaurus Based Heterogeneous Graph | - | 0
HGSGNLP at IEST 2018: An Ensemble of Machine Learning and Deep Neural Architectures for Implicit Emotion Classification in Tweets | - | 0
Context-Aware Neural Machine Translation Decoding | - | 0
HHU at SemEval-2016 Task 1: Multiple Approaches to Measuring Semantic Textual Similarity | - | 0
Hybrid Improved Document-level Embedding (HIDE) | - | 0
Page 172 of 401

No leaderboard results yet.