
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
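As a minimal illustration of the general idea (not the method of any specific paper listed below), the sketch below builds dense word vectors from a toy corpus by factorizing a co-occurrence matrix with truncated SVD, in the spirit of count-based methods such as LSA and GloVe. The corpus, window size, and embedding dimension are arbitrary choices for the example.

```python
import numpy as np

# Toy corpus; real systems train on corpora with billions of tokens.
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "a cat chased a dog",
]

tokens = [sentence.split() for sentence in corpus]
vocab = sorted({w for sent in tokens for w in sent})
idx = {w: i for i, w in enumerate(vocab)}

# Symmetric co-occurrence counts within a window of 2 tokens.
window = 2
counts = np.zeros((len(vocab), len(vocab)))
for sent in tokens:
    for i, w in enumerate(sent):
        lo, hi = max(0, i - window), min(len(sent), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                counts[idx[w], idx[sent[j]]] += 1.0

# Truncated SVD of the log-smoothed count matrix yields low-dimensional
# real-valued vectors: each word is mapped to a point in R^dim.
u, s, _ = np.linalg.svd(np.log1p(counts))
dim = 4
embeddings = u[:, :dim] * s[:dim]

def vec(word):
    return embeddings[idx[word]]

def cosine(a, b):
    # Cosine similarity; small epsilon guards against zero vectors.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

print(round(cosine(vec("cat"), vec("dog")), 3))
```

Words that appear in similar contexts end up with higher cosine similarity; neural methods such as Word2Vec learn comparable vectors by gradient descent instead of matrix factorization.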

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2851–2875 of 4002 papers

Titles marked [Code] have an associated code implementation; every entry below currently shows a Hype score of 0.

- Knowing the Author by the Company His Words Keep
- KIT-Multi: A Translation-Oriented Multilingual Embedding Corpus
- Joint Learning of Sense and Word Embeddings
- Factors Influencing the Surprising Instability of Word Embeddings [Code]
- Automated Detection of Adverse Drug Reactions in the Biomedical Literature Using Convolutional Neural Networks and Biomedical Word Embeddings
- DeepEmo: Learning and Enriching Pattern-Based Emotion Representations [Code]
- Can Eye Movement Data Be Used As Ground Truth For Word Embeddings Evaluation?
- Linguistically-Informed Self-Attention for Semantic Role Labeling [Code]
- Bilingual Embeddings with Random Walks over Multilingual Wordnets
- NE-Table: A Neural key-value table for Named Entities [Code]
- Dynamic Meta-Embeddings for Improved Sentence Representations [Code]
- Automated essay scoring with string kernels and word embeddings
- A Deep Representation Empowered Distant Supervision Paradigm for Clinical Information Extraction
- Probabilistic Word Association for Dialogue Act Classification with Recurrent Neural Networks [Code]
- LightRel SemEval-2018 Task 7: Lightweight and Fast Relation Classification
- NTUA-SLP at SemEval-2018 Task 2: Predicting Emojis using RNNs with Context-aware Attention [Code]
- NTUA-SLP at SemEval-2018 Task 1: Predicting Affective Content in Tweets with Deep Attentive RNNs and Transfer Learning [Code]
- NTUA-SLP at SemEval-2018 Task 3: Tracking Ironic Tweets using Ensembles of Word and Character Level Attentive RNNs [Code]
- When and Why are Pre-trained Word Embeddings Useful for Neural Machine Translation? [Code]
- A Deeper Look into Dependency-Based Word Embeddings
- Frustratingly Easy Meta-Embedding -- Computing Meta-Embeddings by Averaging Source Word Embeddings [Code]
- Amobee at SemEval-2018 Task 1: GRU Neural Network with a CNN Attention Mechanism for Sentiment Classification
- Word2Vec applied to Recommendation: Hyperparameters Matter [Code]
- Evaluating Word Embedding Hyper-Parameters for Similarity and Analogy Tasks
- Exploiting Task-Oriented Resources to Learn Word Embeddings for Clinical Abbreviation Expansion
Page 115 of 161

No leaderboard results yet.