SOTAVerified

Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
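To make the mapping concrete, here is a minimal sketch of the distributional idea behind such embeddings: each word is represented by a vector of co-occurrence counts over a tiny toy corpus, and similarity between words is measured with cosine similarity. The corpus, window size, and function names are illustrative choices, not any particular method from the listed papers.

```python
from collections import defaultdict
import math

# Toy corpus: each sentence is a list of tokens.
corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the rug".split(),
    "the cat chased the dog".split(),
]

# Each word's vector holds counts of words seen within a +/-1 token window.
vocab = sorted({w for sent in corpus for w in sent})
index = {w: i for i, w in enumerate(vocab)}
vectors = defaultdict(lambda: [0.0] * len(vocab))

for sent in corpus:
    for i, w in enumerate(sent):
        for j in (i - 1, i + 1):
            if 0 <= j < len(sent):
                vectors[w][index[sent[j]]] += 1.0

def cosine(u, v):
    # Cosine similarity between two vectors of real numbers.
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# "cat" and "dog" appear in similar contexts, so their vectors end up
# closer to each other than either is to "mat".
print(cosine(vectors["cat"], vectors["dog"]))
print(cosine(vectors["cat"], vectors["mat"]))
```

Methods like Word2Vec and GloVe replace these raw counts with dense, low-dimensional vectors learned by optimization, but the underlying principle is the same: words occurring in similar contexts receive similar vectors.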

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2176–2200 of 4002 papers

Title | Status | Hype
Syntax-Enhanced Neural Machine Translation with Syntax-Aware Word Representations | | 0
Learning Embeddings into Entropic Wasserstein Spaces | Code | 0
Models in the Wild: On Corruption Robustness of NLP Systems | | 0
HHMM at SemEval-2019 Task 2: Unsupervised Frame Induction using Contextualized Word Embeddings | Code | 0
A Type-Driven Vector Semantics for Ellipsis with Anaphora using Lambek Calculus with Limited Contraction | | 0
Pretrained Transformers for Simple Question Answering | | 0
Investigating the Stability of Concrete Nouns in Word Embeddings | | 0
Predicting Word Concreteness and Imagery | | 0
Words are Vectors, Dependencies are Matrices: Learning Word Embeddings from Dependency Graphs | | 0
On Learning Word Embeddings From Linguistically Augmented Text Corpora | | 0
Detecting Paraphrases of Standard Clause Titles in Insurance Contracts | | 0
A Comparison of Context-sensitive Models for Lexical Substitution | | 0
Semantic Frame Embeddings for Detecting Relations between Software Requirements | | 0
Aligning Open IE Relations and KB Relations using a Siamese Network Based on Word Embedding | | 0
Language-Agnostic Model for Aspect-Based Sentiment Analysis | | 0
Distribution is not enough: going Firther | | 0
Nested Variational Autoencoder for Topic Modeling on Microtexts with Word Vectors | Code | 0
Wasserstein Barycenter Model Ensembling | | 0
Zero-training Sentence Embedding via Orthogonal Basis | Code | 0
RelWalk -- A Latent Variable Model Approach to Knowledge Graph Embedding | | 0
The Effectiveness of Pre-Trained Code Embeddings | | 0
Unsupervised Hyper-alignment for Multilingual Word Embeddings | | 0
Encoding Category Trees Into Word-Embeddings Using Geometric Approach | Code | 0
Learning Mixed-Curvature Representations in Product Spaces | | 0
Poincaré GloVe: Hyperbolic Word Embeddings | Code | 0
Page 88 of 161

No leaderboard results yet.