
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
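The mapping from words to real-valued vectors can be sketched with a toy example. This is an illustration only, not a trained model: the embedding rows below are random placeholders, whereas Word2Vec or GloVe would learn them from corpus statistics.

```python
import numpy as np

# Toy vocabulary mapped to dense real-valued vectors via an embedding matrix.
rng = np.random.default_rng(0)
vocab = ["king", "queen", "man", "woman"]
word_to_index = {w: i for i, w in enumerate(vocab)}

embedding_dim = 8
# In Word2Vec/GloVe these rows would be learned; here they are random.
embeddings = rng.normal(size=(len(vocab), embedding_dim))

def vector(word):
    """Look up the embedding vector for a word."""
    return embeddings[word_to_index[word]]

def cosine_similarity(u, v):
    """Similarity between embeddings is commonly measured by cosine."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(vector("king").shape)  # each word becomes an 8-dimensional vector
print(cosine_similarity(vector("king"), vector("queen")))
```

With trained embeddings, cosine similarity between related words (e.g. "king" and "queen") would be high; with the random placeholders here the value is arbitrary but always lies in [-1, 1].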

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2501–2525 of 4002 papers

Predictive Embeddings for Hate Speech Detection on Twitter
Using Word Embeddings to Explore the Learned Representations of Convolutional Neural Networks
Learning and Evaluating Sparse Interpretable Sentence Embeddings
Predicting the Argumenthood of English Prepositional Phrases
FRAGE: Frequency-Agnostic Word Representation [Code]
Meta-Embedding as Auxiliary Task Regularization
Macquarie University at BioASQ 6b: Deep learning and deep reinforcement learning for query-based multi-document summarisation
Incorporating Syntactic and Semantic Information in Word Embeddings using Graph Convolutional Networks [Code]
Distilled Wasserstein Learning for Word Embedding and Topic Modeling
Generalizing Word Embeddings using Bag of Subwords [Code]
AWE: Asymmetric Word Embedding for Textual Entailment
xSense: Learning Sense-Separated Sparse Representations and Textual Definitions for Explainable Word Sense Networks [Code]
Unsupervised Cross-lingual Transfer of Word Embedding Spaces [Code]
SHOMA at Parseme Shared Task on Automatic Identification of VMWEs: Neural Multiword Expression Tagging with High Generalisation [Code]
Exploration on Grounded Word Embedding: Matching Words and Images with Image-Enhanced Skip-Gram Model
Unsupervised Cross-lingual Word Embedding by Multilingual Neural Language Models
Learning Embeddings of Directed Networks with Text-Associated Nodes — with Applications in Software Package Dependency Networks
Uncovering divergent linguistic information in word embeddings with lessons for intrinsic and extrinsic evaluation [Code]
An Analysis of Hierarchical Text Classification Using Word Embeddings
Sentylic at IEST 2018: Gated Recurrent Neural Network and Capsule Network Based Approach for Implicit Emotion Detection
Utilizing Character and Word Embeddings for Text Normalization with Sequence-to-Sequence Models
Firearms and Tigers are Dangerous, Kitchen Knives and Zebras are Not: Testing whether Word Embeddings Can Tell
Segmentation-free Compositional n-gram Embedding [Code]
NTUA-SLP at IEST 2018: Ensemble of Neural Transfer Methods for Implicit Emotion Classification [Code]
Deep learning for language understanding of mental health concepts derived from Cognitive Behavioural Therapy [Code]
Page 101 of 161

No leaderboard results yet.