
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
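To make the "words mapped to vectors of real numbers" idea concrete, here is a minimal sketch of an embedding lookup in NumPy. The vocabulary, dimensionality, and random vectors are illustrative assumptions; real embeddings are learned from data rather than sampled.

```python
import numpy as np

# Toy example: each word in a small vocabulary gets a dense,
# real-valued vector (random here; real embeddings are trained).
vocab = ["king", "queen", "man", "woman"]
dim = 8  # illustrative embedding dimensionality
rng = np.random.default_rng(0)
embeddings = {word: rng.normal(size=dim) for word in vocab}

def cosine_similarity(u, v):
    """Cosine similarity, the usual way to compare embedding vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(embeddings["king"])                      # the vector for "king"
print(cosine_similarity(embeddings["king"],
                        embeddings["queen"]))  # similarity of two words
```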

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches trained on an NLP task such as language modeling or document classification.
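As an example of one such technique, the sketch below trains a skip-gram Word2Vec model with gensim. It assumes gensim >= 4.0; the toy corpus and hyperparameters are placeholders, not a recommended configuration.

```python
from gensim.models import Word2Vec

# Tiny placeholder corpus: a list of tokenized sentences.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context", "windows"],
]

# Train skip-gram Word2Vec (sg=1); hyperparameters are illustrative.
model = Word2Vec(
    sentences,
    vector_size=50,  # dimensionality of the learned vectors
    window=5,        # context window size
    min_count=1,     # keep every token in this toy corpus
    sg=1,            # 1 = skip-gram, 0 = CBOW
)

vector = model.wv["embeddings"]          # learned vector for a word
similar = model.wv.most_similar("word")  # nearest neighbors by cosine similarity
```

GloVe, by contrast, fits vectors to global word co-occurrence statistics rather than predicting context words, but the resulting artifact is the same: one real-valued vector per vocabulary word.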

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2201–2210 of 4002 papers

Title | Status | Hype
Don't Settle for Average, Go for the Max: Fuzzy Sets and Max-Pooled Word Vectors | Code | 0
On the Effect of Low-Frequency Terms on Neural-IR Models | Code | 0
Enabling Open-World Specification Mining via Unsupervised Learning |  | 0
Are We Consistently Biased? Multidimensional Analysis of Biases in Distributional Word Vectors | Code | 0
Contextualized Word Embeddings Enhanced Event Temporal Relation Extraction for Story Understanding |  | 0
Better Automatic Evaluation of Open-Domain Dialogue Systems with Contextualized Embeddings |  | 0
A bag-of-concepts model improves relation extraction in a narrow knowledge domain with limited data |  | 0
Integrating Social Media into a Pan-European Flood Awareness System: A Multilingual Approach |  | 0
Understanding the Stability of Medical Concept Embeddings |  | 0
Weakly-Supervised Concept-based Adversarial Learning for Cross-lingual Word Embeddings |  | 0

No leaderboard results yet.