
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
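In other words, each word becomes a point in a shared vector space, and geometric closeness stands in for semantic relatedness. The toy sketch below uses hypothetical 3-dimensional vectors (real embeddings typically have 50 to 300+ dimensions) to show the standard cosine-similarity comparison:

```python
import numpy as np

# Hypothetical 3-dimensional embeddings for illustration only;
# real models learn these values and use far more dimensions.
king  = np.array([0.8, 0.3, 0.1])
queen = np.array([0.7, 0.4, 0.2])
apple = np.array([0.1, 0.9, 0.6])

def cosine(u, v):
    # Cosine similarity: close to 1.0 for vectors pointing the same way.
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(king, queen))  # ~0.98: related words end up close together
print(cosine(king, apple))  # ~0.44: unrelated words end up farther apart
```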

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
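As a concrete example, the sketch below trains a skip-gram Word2Vec model with the gensim library (an assumption: gensim 4.x installed via `pip install gensim`; the three-sentence corpus is a placeholder for real tokenized text):

```python
from gensim.models import Word2Vec

# Placeholder corpus: a real training set would hold millions of
# tokenized sentences.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context", "windows"],
    ["glove", "factorizes", "a", "global", "cooccurrence", "matrix"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=100,  # dimensionality of the learned vectors
    window=5,         # context words considered on each side
    min_count=1,      # keep every word (only sensible for a toy corpus)
    sg=1,             # 1 = skip-gram, 0 = CBOW
)

vec = model.wv["embeddings"]                              # the learned 100-d vector
neighbours = model.wv.most_similar("embeddings", topn=3)  # nearest words by cosine
```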

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2171–2180 of 4002 papers

Title | Status | Hype
Retrieving Multi-Entity Associations: An Evaluation of Combination Modes for Word Embeddings | – | 0
Deeper Text Understanding for IR with Contextual Neural Language Modeling | Code | 0
Domain adaptation for part-of-speech tagging of noisy user-generated text | – | 0
SuperTML: Domain Transfer from Computer Vision to Structured Tabular Data through Two-Dimensional Word Embedding | – | 0
Tracing cultural diachronic semantic shifts in Russian using word embeddings: test sets and baselines | Code | 0
Syntax-Enhanced Neural Machine Translation with Syntax-Aware Word Representations | – | 0
Learning Embeddings into Entropic Wasserstein Spaces | Code | 0
Models in the Wild: On Corruption Robustness of NLP Systems | – | 0
HHMM at SemEval-2019 Task 2: Unsupervised Frame Induction using Contextualized Word Embeddings | Code | 0
A Type-Driven Vector Semantics for Ellipsis with Anaphora using Lambek Calculus with Limited Contraction | – | 0

No leaderboard results yet.