
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural approaches that are trained on an NLP task such as language modeling or document classification; a minimal training example is sketched below.
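
As a minimal illustration (not tied to any paper listed on this page), the sketch below trains skip-gram Word2Vec embeddings on a toy corpus using gensim. It assumes gensim 4.x, where the dimensionality parameter is named vector_size; the corpus and hyperparameters are illustrative toys, not values from any specific work.

```python
# Minimal sketch: learning word embeddings with gensim's Word2Vec.
# Assumes gensim 4.x (pip install gensim); the corpus and settings
# below are toy examples chosen for illustration only.
from gensim.models import Word2Vec

# A toy corpus: each sentence is a pre-tokenized list of words.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the real-valued vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=200,       # many passes, since the corpus is tiny
    seed=42,
)

vec = model.wv["king"]        # the learned 50-dimensional vector for "king"
print(vec.shape)              # -> (50,)

# Nearest neighbors by cosine similarity in the embedding space.
print(model.wv.most_similar("king", topn=3))
```

Once trained, each vocabulary word maps to a fixed real-valued vector, and cosine similarity between vectors is the usual way to measure semantic relatedness.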

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1311–1320 of 4002 papers

| Title | Status | Hype |
| --- | --- | --- |
| VCDM: Leveraging Variational Bi-encoding and Deep Contextualized Word Representations for Improved Definition Modeling | Code | 0 |
| Analogies minus analogy test: measuring regularities in word embeddings | Code | 0 |
| Metaphor Interpretation Using Word Embeddings | | 0 |
| Compositional Demographic Word Embeddings | Code | 1 |
| Using Sentences as Semantic Representations in Large Scale Zero-Shot Learning | | 0 |
| Robustness and Reliability of Gender Bias Assessment in Word Embeddings: The Role of Base Pairs | Code | 0 |
| Embedding Words in Non-Vector Space with Unsupervised Graph Learning | Code | 1 |
| Intrinsic Probing through Dimension Selection | Code | 0 |
| On the Effects of Knowledge-Augmented Data in Word Embeddings | | 0 |
| PublishInCovid19 at WNUT 2020 Shared Task-1: Entity Recognition in Wet Lab Protocols using Structured Learning Ensemble and Contextualised Embeddings | | 0 |

No leaderboard results yet.