
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train the vectors on an NLP task such as language modeling or document classification.
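As a concrete illustration of one such technique, the sketch below trains skip-gram embeddings with the gensim library. This is a minimal sketch assuming gensim 4.x; the toy corpus and hyperparameters are illustrative only and are not drawn from any paper listed here.

# Minimal sketch: skip-gram word embeddings with gensim (assumes gensim 4.x).
# The corpus and hyperparameters are toy values for illustration.
from gensim.models import Word2Vec

# A tiny tokenized corpus; real applications use millions of sentences.
corpus = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
    ["a", "cat", "and", "a", "dog", "played"],
]

# sg=1 selects the skip-gram objective; vector_size is the embedding dimension.
model = Word2Vec(
    sentences=corpus,
    vector_size=50,
    window=3,
    min_count=1,
    sg=1,
    epochs=50,
)

vec = model.wv["cat"]                        # a 50-dimensional real-valued vector
print(model.wv.most_similar("cat", topn=2))  # nearest neighbors by cosine similarity

After training, model.wv maps each vocabulary word to a vector of real numbers, and nearest neighbors under cosine similarity tend to be semantically related words.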

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3311–3320 of 4002 papers

Title | Status | Hype
Predicting Role Relevance with Minimal Domain Expertise in a Financial Domain | - | 0
An Empirical Analysis of NMT-Derived Interlingual Embeddings and their Use in Parallel Sentence Identification | - | 0
Baselines and test data for cross-lingual inference | Code | 0
FEUP at SemEval-2017 Task 5: Predicting Sentiment Polarity and Intensity with Financial Word Embeddings | Code | 0
How Robust Are Character-Based Word Embeddings in Tagging and MT Against Wrod Scramlbing or Randdm Nouse? | - | 0
Incremental Skip-gram Model with Negative Sampling | Code | 0
ConceptNet at SemEval-2017 Task 2: Extending Word Embeddings with Multilingual Relational Knowledge | Code | 2
Exploring Word Embeddings for Unsupervised Textual User-Generated Content Normalization | - | 0
Entity Linking for Queries by Searching Wikipedia Sentences | - | 0
Word Embeddings via Tensor Factorization | Code | 0

Leaderboard

No leaderboard results yet.