
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
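To make the mapping concrete, here is a minimal sketch in Python: a lookup table from words to real-valued vectors, compared with cosine similarity. The 4-dimensional vectors below are invented purely for illustration; real embeddings are learned from data and typically have tens to hundreds of dimensions.

```python
import numpy as np

# Each word maps to a vector of real numbers. These toy 4-dimensional
# vectors are made up for illustration; learned embeddings typically
# have 50-300 dimensions.
embeddings = {
    "king":  np.array([0.8, 0.1, 0.7, 0.3]),
    "queen": np.array([0.8, 0.1, 0.6, 0.9]),
    "apple": np.array([0.1, 0.9, 0.2, 0.1]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; values near 1 mean similar."""
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # relatively high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # relatively low
```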

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that are trained on an NLP task such as language modeling or document classification.
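As a sketch of one such technique, the example below trains skip-gram Word2Vec embeddings with the gensim library (assuming the gensim 4.x API); the three-sentence corpus is a stand-in chosen only to make the script self-contained, and any real use would need a far larger corpus.

```python
# Minimal Word2Vec training sketch, assuming gensim 4.x (pip install gensim).
from gensim.models import Word2Vec

# Stand-in corpus: a list of tokenized sentences. Real training uses
# millions of sentences.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "local", "context"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
]

# sg=1 selects the skip-gram objective; vector_size is the embedding
# dimensionality; window is the context size around each target word.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

vector = model.wv["embeddings"]                     # a 50-dimensional real vector
print(model.wv.most_similar("embeddings", topn=3))  # nearest neighbors by cosine
```

GloVe differs in that it fits embeddings to global word co-occurrence counts rather than predicting context words, but the end product is the same kind of vector lookup table.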

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2751-2760 of 4002 papers

Title | Status | Hype
UWB at SemEval-2018 Task 1: Emotion Intensity Detection in Tweets | - | 0
Pruning Basic Elements for Better Automatic Evaluation of Summaries | - | 0
What the Vec? Towards Probabilistically Grounded Embeddings | - | 0
Multi-turn Dialogue Response Generation in an Adversarial Learning Framework | - | 0
Quantum-inspired Complex Word Embedding | - | 0
Unsupervised Alignment of Embeddings with Wasserstein Procrustes | Code | 0
Unsupervised detection of diachronic word sense evolution | - | 0
Convolutional neural networks for chemical-disease relation extraction are improved with character-based word embeddings | - | 0
UMDuluth-CS8761 at SemEval-2018 Task 9: Hypernym Discovery using Hearst Patterns, Co-occurrence frequencies and Word Embeddings | - | 0
Lifelong Domain Word Embedding via Meta-Learning | Code | 0
Page 276 of 401

Leaderboard

No leaderboard results yet.